I've been trying to finish up a web scraper for a website of mine and have hit a stumbling block with folder creation for image storage.
My code looks like this:
//ftpUser and ftpPass are set at the head of the class
FtpWebRequest lrgRequest = (FtpWebRequest) FtpWebRequest.Create("ftp://ftp.mysite.com/httpdocs/images/large/imgFolder");
lrgRequest.Credentials = new NetworkCredential(ftpUser, ftpPass);
lrgRequest.KeepAlive = false;
lrgRequest.Method = WebRequestMethods.Ftp.MakeDirectory;
FtpWebResponse response = (FtpWebResponse) lrgRequest.GetResponse();
Console.WriteLine(response);
When I run this code, it gets to GetResponse and throws a 550 error saying the folder isn't found.
I've compared my approach to a number of examples, and by the standard approach it should work. The FTP address is valid and has been checked, so I'm wondering: is there an issue with my server that is stopping this, or is my C# causing the problem?
If anyone needs any more info, please just say.
As always, any help is greatly appreciated.
Regards
Barry
Definition of FTP 550:
Requested action not taken. File unavailable (e.g., file not found, no access).
I'd check that you have the appropriate permissions (or your app does) and verify that the file does indeed exist.
Since you are getting a response code, I doubt your code above is the cause of the problem. However, you could always check the AuthenticationLevel and ImpersonationLevel (see the snippet below) to see if these provide any useful information.
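For example, a quick way to inspect those two settings on the request from the question (the defaults noted in the comments are taken from the WebRequest documentation):
Console.WriteLine(lrgRequest.AuthenticationLevel); // default: MutualAuthRequested
Console.WriteLine(lrgRequest.ImpersonationLevel);  // default: Delegation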
I am new to SophosLabs Intelix. I am trying to build a sample in my ASP.NET application (WebForms/MVC) in which I want to run an antivirus scan on a file uploaded by the user. If the uploaded file is clean, I want to upload it to the server; otherwise, I want to cancel the operation. I specifically want to use SophosLabs Intelix for this functionality. It would be great if someone could guide me on this; a code sample in C# would be appreciated a lot.
Thanks in advance for your help.
Sample:
if (file.HasFile)
{
    // run an antivirus scan
    // result
    if (result == NoThreat)
    {
        // Uploaded successfully
    }
    else
    {
        // File contains a virus. Upload failed!
    }
}
else
{
    // Please select a file to upload!
}
I suggest starting with the implementation of the OAuth 2 authentication request. You can find some ideas here: How do I get an OAuth 2.0 authentication token in C#
As soon as you have the access_token, you can use it for the /reports/?sha256=... query.
It may return the report immediately.
If it does not return any data (404), that request was free, and you can POST the file to the root endpoint "/" for analysis.
The analysis can take a few seconds or minutes; while it runs, poll the /reports/{job_id} endpoint until you get the report.
If you cannot wait minutes for decision data, you can use the File Hash Lookup API as well, which returns immediately.
It may give a reputationScore between 30 and 69, in which case it cannot decide how dangerous the file is, but you can still perform a static or dynamic analysis on the file.
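If it helps, here is a rough C# sketch of that flow using HttpClient. The base address, the multipart field name, and the job_id JSON property are assumptions on my part, not taken from the Intelix docs, so check them against the official documentation:
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text.Json;
using System.Threading.Tasks;

class IntelixScanSketch
{
    // Placeholder base address; take the real regional URL from your Intelix account.
    static readonly HttpClient Http = new HttpClient
    {
        BaseAddress = new Uri("https://api.example-intelix.com/")
    };

    // Follows the steps above: hash lookup first, then submit and poll if needed.
    static async Task<string> GetReportJsonAsync(string accessToken, byte[] fileBytes,
                                                 string fileName, string sha256)
    {
        Http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // 1. Free lookup by hash: if a report already exists, we are done.
        HttpResponseMessage lookup = await Http.GetAsync("reports/?sha256=" + sha256);
        if (lookup.IsSuccessStatusCode)
            return await lookup.Content.ReadAsStringAsync();

        // 2. No report yet (404): POST the file itself to the root endpoint.
        var form = new MultipartFormDataContent();
        form.Add(new ByteArrayContent(fileBytes), "file", fileName); // field name assumed
        HttpResponseMessage submit = await Http.PostAsync("/", form);
        submit.EnsureSuccessStatusCode();
        string jobId;
        using (JsonDocument doc = JsonDocument.Parse(await submit.Content.ReadAsStringAsync()))
            jobId = doc.RootElement.GetProperty("job_id").GetString(); // property name assumed

        // 3. Poll /reports/{job_id} every few seconds until the analysis completes.
        while (true)
        {
            HttpResponseMessage report = await Http.GetAsync("reports/" + jobId);
            if (report.IsSuccessStatusCode)
                return await report.Content.ReadAsStringAsync();
            await Task.Delay(TimeSpan.FromSeconds(5));
        }
    }
}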
I currently maintain a nearly three-year-old ASP.NET MVC website. The application runs on IIS (now IIS 7) with the .NET 4 framework, is used by clients almost every day, and handles a lot of upload/download file transactions. It also uses ELMAH for unhandled exception handling. The application ran well until a few months ago, when users started reporting that they cannot download files: there is no error message, the download process just does nothing, and there is no log in the browser console either. After some checking, I found that all menus with a download function use an HTTP response like this:
Response.Clear();
Response.Cache.SetCacheability(HttpCacheability.Private);
Response.Expires = -1;
Response.Buffer = true;
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Length", Convert.ToString(file_in_bytes.Length));
Response.AddHeader("Content-Disposition"
, string.Format("{0};FileName=\"{1}\"", "attachment", fileName));
Response.AddHeader("Set-Cookie", "fileDownload=true; path=/");
Response.BinaryWrite(hasil);
Response.End();
Nothing seems wrong (there are no compile or runtime errors on the development server). We've also checked ELMAH's log, but no related error message appears there. The problem temporarily disappears after our server management team recycles the application pool in IIS.
This web application also shares its application pool with another web application, and when the error occurred, both applications were affected. Only the download function is affected; other functions like retrieving data from the database and inserting/editing/deleting data work fine.
I also checked the web server's Event Viewer, but there are no errors there either. The very odd thing for us is that the error temporarily disappears after we recycle the application pool, and after several days, weeks, or months it suddenly appears again.
Is there any log we've missed that might help trace this? Or is something wrong with the download code? And why is it temporarily fixed by recycling the application pool?
Another note: the files users download average 500 KB to 2 MB, in ZIP format containing several PDF files.
Update: after a few more hours of investigating, I found that this web application uses different download methods: some use the HTTP response like the code above, and some use FileContentResult as the return value, but both use jquery.fileDownload on the client side. I also found this method in several controllers that have a file download method in this app:
private void CheckAndHandleFileResult(ActionExecutedContext filterContext)
{
var httpContext = filterContext.HttpContext;
var response = httpContext.Response;
if (filterContext.Result is FileContentResult)
{
//jquery.fileDownload uses this cookie to determine that
//a file download has completed successfully
response.AppendCookie(new HttpCookie(CookieName, "true")
{ Path = CookiePath });
}
else
{
//ensure that the cookie is removed in case someone did
//a file download without using jquery.fileDownload
if (httpContext.Request.Cookies[CookieName] != null)
{
response.AppendCookie(new HttpCookie(CookieName, "true")
{ Expires = DateTime.Now.AddYears(-1), Path = CookiePath });
}
}
}
Actually, I'm not really sure whether that method is related to this error, but it is called from a method that overrides System.Web.Mvc.Controller's OnActionExecuted, and it contains the lines that add the file download cookie when the result is a FileContentResult, or delete the cookie when the result is not a FileContentResult and the download cookie exists. Is it possible that the cookie is accidentally not deleted/cleared after it is created? And since the download method is called by nearly 100 users every day, is it possible that the cookies pile up and cause the IIS worker process to crash?
I've also checked some references about cookies and their relation to IIS session state (my app uses In-Proc state). Am I close? Or did I miss something?
Is there a reason why Response.Buffer is set to true? When buffering is enabled, the response is sent only after all processing is completed. Can you disable it by setting it to false (as in the sketch below) and see if this works? This could be the reason for having to recycle the app pool. You can also check whether you are facing these issues - Help link
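A minimal variation of the snippet from the question with buffering disabled (variable names reused from the question; untested):
Response.Clear();
Response.Cache.SetCacheability(HttpCacheability.Private);
Response.Expires = -1;
Response.Buffer = false; // stream bytes to the client as they are written
Response.ContentType = "application/octet-stream";
Response.AddHeader("Content-Length", Convert.ToString(file_in_bytes.Length));
Response.AddHeader("Content-Disposition",
    string.Format("{0};FileName=\"{1}\"", "attachment", fileName));
Response.AddHeader("Set-Cookie", "fileDownload=true; path=/");
Response.BinaryWrite(hasil);
Response.Flush();
Response.End();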
I am trying to move a file from one folder to another using FtpWebRequest, but I keep getting error 550. This is my code:
var requestMove = (FtpWebRequest)WebRequest.Create(Helper.PathFtp + Helper.NewFolder + file);
requestMove.Method = WebRequestMethods.Ftp.Rename;
requestMove.Credentials = networkCredential;
requestMove.RenameTo = "../" + Helper.OldFolder + file;
requestMove.GetResponse();
I can list, upload, download, and delete files, but moving/renaming is hopeless. I have read several posts, both on Stack Overflow and other sites, and have tried things like setting Proxy to null and adding special characters to the paths, but I can't find a solution that works.
The path I use in WebRequest.Create is correct, since I can delete the file at that path, so it must be the RenameTo that I have an issue with. Any ideas?
Error 550 means access denied. If the FTP user has sufficient rights, a program (e.g. an antivirus scanner, the Windows thumbnail generator, etc.) could have the file open and be denying your move request.
You need to contact the server administrator to get around the problem.
I have to ask this question to get some answers before I change my code. Pardon me if the question doesn't make sense to you.
Scenario 1:
string path = "ftp://1.1.1.1/mpg/test";
FtpWebRequest requestDir = (FtpWebRequest)FtpWebRequest.Create(new Uri(path));
requestDir.Credentials = new NetworkCredential("sh","se");
requestDir.Method = WebRequestMethods.Ftp.MakeDirectory;
Using the same code to create the directory structure against my local FileZilla FTP server works fine.
Scenario 2:
Using the above code against a remote FTP server to do the same job throws an exception: error 550, no file found or no access.
Question 1: I have full read/write permission on the folder. If it's not a permission issue, what else should I keep in mind to look for?
Question 2: If I modify my code like this:
step 1: make the "mpg" directory first
step 2: make the "test" directory after that
it works fine. Does that mean MakeDirectory won't create a subdirectory inside the main directory in one call?
If that's the case, how was it created on my local FTP server?
Any help appreciated.
Thanks in Advance.
I don't know exactly why, but some FTP servers respond differently. I know because exactly this scenario happened to me while developing dropf. If you are using the FileZilla FTP server, creating a path with subdirectories works, but not on the IIS FTP server.
So I suggest you write one method for creating directories. It takes one parameter, the full path (in your scenario, mpg/test); split it by '/' and create each directory one by one. This way it works on FileZilla, IIS FTP, and some other FTP servers. Something like the sketch below:
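This is only a sketch; the catch assumes the server answers 550 for a directory that already exists, so adjust it to your server's behavior:
using System;
using System.Net;

class FtpDirectoryHelper
{
    // Create each segment of the path in turn, e.g. first "mpg", then "mpg/test".
    public static void CreateDirectoryTree(string ftpRoot, string relativePath,
                                           NetworkCredential credential)
    {
        string current = ftpRoot.TrimEnd('/');
        foreach (string segment in relativePath.Split(new[] { '/' },
                     StringSplitOptions.RemoveEmptyEntries))
        {
            current += "/" + segment;
            try
            {
                FtpWebRequest request = (FtpWebRequest)WebRequest.Create(current);
                request.Credentials = credential;
                request.Method = WebRequestMethods.Ftp.MakeDirectory;
                using (FtpWebResponse response = (FtpWebResponse)request.GetResponse()) { }
            }
            catch (WebException)
            {
                // Many servers answer 550 when the directory already exists; ignore and move on.
            }
        }
    }
}
Usage, with the values from your scenario:
FtpDirectoryHelper.CreateDirectoryTree("ftp://1.1.1.1", "mpg/test", new NetworkCredential("sh", "se"));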
I'm having a problem that's been bugging me for a while.
I'm downloading files from an FTP server in .NET, and randomly (and I insist, it is completely random), I get the following error:
System.Net.WebException: The remote server returned an error: (550) File unavailable (e.g., file not found, no access).
Our .NET code implements a retry mechanism, so when this error happens, the code will try to download all the files again. Sometimes it will succeed; other times the 550 error will happen on another file, or on the same file. It is completely random.
Here is a snippet of the DownloadFile method that is called for each file to be downloaded:
byte[] byWork = new byte[2047];
...
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(new Uri(_uri.ToString() + "/" + filename));
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = new NetworkCredential(_Username, _Password);
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
{
using (Stream rs = response.GetResponseStream())
{
using (FileStream fs = new FileStream(destination, FileMode.Create))
{
do
{
iWork = rs.Read(byWork, 0, byWork.Length);
fs.Write(byWork, 0, iWork);
} while (iWork != 0);
fs.Flush();
}
}
}
Again, the thing that bugs me is that if there were an error in this code, the 550 error would happen every time. However, we can try to download a file, get the error, try to download the same file with the same parameters again, and it will work. And it seems to happen more frequently with larger files. Any ideas?
Please note, the below is just anecdotal; I don't have anything except vague memories and assumptions to back it up. So rather than a real solution, just take it as a "cheer up, it might not be your fault at all".
I think 550 errors are more likely to be due to some issue with the server rather than the client. I remember getting 550 errors quite often when using an old ISP's badly maintained FTP server, and I did try various clients without it making any real difference. I also remember seeing other people posting messages about similar problems with the same and other servers.
I think the best way to handle it is to just retry the download automatically; hopefully after a few tries you'll get it, though obviously this means you waste bandwidth. A minimal retry wrapper is sketched below.
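Sketch only; tune the attempt count and delay, and wrap whatever your actual download call is:
using System;
using System.Net;
using System.Threading;

static void DownloadWithRetry(Action download, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            download();
            return; // success
        }
        catch (WebException) when (attempt < maxAttempts)
        {
            // Transient 550s often clear up on their own; back off a little and retry.
            Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
        }
    }
}
For example, around the DownloadFile method from the question:
DownloadWithRetry(() => DownloadFile(filename, destination)); // arguments assumed from the snippet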