Help needed for parsing an FTP file list in C#

I am using this code to get a list of all the files in a directory, where webRequestUrl = something.com/directory/:
FtpWebRequest fwrr = (FtpWebRequest)FtpWebRequest.Create(new Uri("ftp://" + webRequestUrl));
fwrr.Credentials = new NetworkCredential(username, password);
fwrr.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
StreamReader srr = new StreamReader(fwrr.GetResponse().GetResponseStream());
string str = srr.ReadLine();
ArrayList strList = new ArrayList();
while (str != null)
{
strList.Add(str);
str = srr.ReadLine();
}
However, I am not getting the list of files; instead I am getting what look like lines from an HTML document.
This FTP server is Windows-based, while the same code works fine against a Unix server.
Please help.
Thanks.

It works for me when the FTP server is on an internal machine and I use ftp://192.168.0.155 - if I try that in IE I get the same kind of HTML result as yours.
I doubt it's happening because of the URL, but can you try replacing the URL with the IP address (just a wild guess)? Even if you are getting HTML, you can strip the unnecessary parts and parse the file entries out of it.
I even tried with ftp://sub.a.com/somefolder and it worked for me. It seems the browser wraps HTML around the FTP response, because I get different HTML when I open the FTP site in IE and in Chrome.
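If all you need are the file names themselves, a name-only listing sidesteps both the server-specific detail format and anything wrapped around it. A minimal sketch reusing the placeholders from the question (webRequestUrl, username, password), assuming using System.Net, System.IO and System.Collections.Generic:
// Minimal sketch: WebRequestMethods.Ftp.ListDirectory returns one bare
// file name per line, so there is no detail format to parse.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://" + webRequestUrl));
request.Credentials = new NetworkCredential(username, password);
request.Method = WebRequestMethods.Ftp.ListDirectory;

List<string> fileNames = new List<string>();
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        fileNames.Add(line);
    }
}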

Related

C# WebClient can not upload large files

I have a large file (500 MB) and my application cannot send it to my web page.
public static void UploadFile(string url, string fileName) {
WebClient client = new WebClient();
byte[] responseBinary = client.UploadFile(url, fileName);
string response = Encoding.UTF8.GetString(responseBinary);
MessageBox.Show(response);
}
It posts to a PHP file. I tried checking with
var_dump($_FILES);
var_dump($_REQUEST);
but both of them are empty.
When I try to upload a small file (8KB) it is in $_FILES['file'].
What am I doing wrong?
You have to modify your server's php.ini file to allow big files to be sent to it. These are the two directives you have to modify:
upload_max_filesize = 900M
post_max_size = 900M
Also, remember to restart Apache, nginx, or whatever your server is after you edit the php.ini file.
EDIT
If you don't have access to this kind of configuration on your host, you could try (it may or may not be allowed) modifying these settings in the .htaccess file at the root of your site by adding the following to it:
php_value upload_max_filesize 900M
php_value post_max_size 900M

C# - Checking several IP addresses with several download methods

I am new to this site so I apologise in advance if I made any formatting errors. Let me know if further clarification is needed.
I am currently working on a program in C# which aims at downloading files from several http or ftp sources.
I want it to go through several IP addresses and check whether several methods can download a known file successfully. If one of these methods downloads the file successfully, it should go to the next IP and do the same thing again.
So far I am using a foreach loop that runs through an array containing the IPs, creating a folder named "IPxx" and both an FTP and an HTTP URI, because I do not know in advance whether a given IP needs an FTP or an HTTP address:
string[] ipArray = new string[4];
ipArray[0]= "xxx.xxx.xxx.xx";
ipArray[1]= "xxx.xxx.xxx.xx";
ipArray[2]= "xxx.xxx.xxx.xx";
ipArray[3]= "xxx.xxx.xxx.xx";
foreach(string ip in ipArray )
{
string ipXx = "IP" + ip.Substring(ip.Length-2);
string ipFolder = folder + @"\" + ipXx;
Directory.CreateDirectory(ipFolder);
string ftpAddr= "ftp://" + ip;
string httpAddr= "http://"+ip;
//DownloadTestMethod(httpAddr, ipFolder);
//DownloadTestMethod2(ftpAddr, ipFolder);
//DownloadTestMethod3(httpAddr, ipFolder);
}
So far so good; everything up to this point works as expected.
However, I struggle as soon as I need to go through several download methods and check whether the file can be downloaded successfully; if not, the program should move on to the next download method and try again.
I came up with the following DownloadTestMethod:
public static void DownloadTestMethod(string httpAddr, string ipFolder)
{
string fileName = "test_10k.bin";
string downloadpath = httpAddr + "/" + fileName;
// I want it to check if the http site is online/working
WebRequest request = WebRequest.Create(downloadpath);
request.Proxy.Credentials = CredentialCache.DefaultCredentials;
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
if (response != null)
{
WebClient webClient = new WebClient();
webClient.Proxy.Credentials=CredentialCache.DefaultCredentials;
webClient.DownloadFile(downloadpath, ipFolder + @"\" + fileName);
}
}
It works when the file downloads successfully; however, I seem to get "stuck" in this method if it cannot download the requested file.
In this case I want it to jump to the next DownloadMethod and check again until all methods are checked.
Furthermore, if the program has gone through every IP and could not download anything, I want it to send an automated email. I know how to implement the email method, but once again I struggle with the flow control.
I am an absolute beginner and have no idea how to go from here to get my desired flow.
You might want to look at the Task Parallel Library and, in particular, its Task.WhenAll() method.
Along with it, you will want to look at error handling for the case where any of your methods returns an error or raises an exception.
https://msdn.microsoft.com/en-us/library/dd270695(v=vs.110).aspx
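As a rough sketch of how that flow could look (the method names TryDownloadAsync and CheckIpAsync and the commented email call are illustrative, not existing APIs; the test file name is taken from your question):
using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Threading.Tasks;

public static class DownloadChecker
{
    // Tries one protocol for one IP; returns false instead of throwing so the
    // caller can simply fall through to the next method.
    static async Task<bool> TryDownloadAsync(string baseAddress, string ipFolder)
    {
        try
        {
            using (var client = new WebClient())
            {
                if (client.Proxy != null)
                    client.Proxy.Credentials = CredentialCache.DefaultCredentials;
                await client.DownloadFileTaskAsync(
                    baseAddress + "/test_10k.bin",
                    Path.Combine(ipFolder, "test_10k.bin"));
            }
            return true;
        }
        catch (Exception)
        {
            return false; // this method failed; the caller tries the next one
        }
    }

    // Tries HTTP first, then FTP; add further methods to the chain as needed.
    static async Task<bool> CheckIpAsync(string ip, string ipFolder)
    {
        Directory.CreateDirectory(ipFolder);
        return await TryDownloadAsync("http://" + ip, ipFolder)
            || await TryDownloadAsync("ftp://" + ip, ipFolder);
    }

    public static async Task RunAsync(string[] ipArray, string folder)
    {
        var tasks = ipArray.Select(
            ip => CheckIpAsync(ip, Path.Combine(folder, "IP" + ip.Substring(ip.Length - 2))));

        bool[] results = await Task.WhenAll(tasks);

        if (!results.Any(ok => ok))
        {
            // SendFailureEmail();  // hypothetical: plug in your own email method here
        }
    }
}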

FtpWebResponse (ListDirectory): FTP approves the login, but does not return folder data

I've been experimenting with using Visual C# to connect to a remote FTP server and list the folders on it. Ultimately, I'm trying to figure out how to automatically upload a file to this server with the click of a button, and I figured my first step would be to see how the folders are structured on the remote server.
However, when I try to view the results in the debugger, the listing appears to be empty, even though I didn't receive any errors and the parameters in the debugger make it look like I was successfully logged in. I also received the "transfer complete" message (226) in the response data. Here's the latest code I have tried, although I have experimented with various settings on the FtpWebRequest. If I turn off UsePassive, it sits there forever and I never get a response back from the server. Does anyone have any ideas about a setting that might be causing this, or some parameter that might need to be set?
private void button4_Click(object sender, EventArgs e)
{
FtpWebRequest ftpRequest = (FtpWebRequest)WebRequest.Create("ftp://sampleftpplace.com/");
ftpRequest.UseBinary = true;
ftpRequest.Credentials = new NetworkCredential("username", "password");
ftpRequest.EnableSsl = true;
ftpRequest.Method = WebRequestMethods.Ftp.ListDirectory;
ftpRequest.Proxy = null;
ftpRequest.Timeout = -1;
ftpRequest.KeepAlive = false;
//ftpRequest.UsePassive = false;
FtpWebResponse response = (FtpWebResponse)ftpRequest.GetResponse();
StreamReader streamReader = new StreamReader(response.GetResponseStream());
List<string> directories = new List<string>();
string line = streamReader.ReadLine();
while (!string.IsNullOrEmpty(line))
{
directories.Add(line);
line = streamReader.ReadLine();
}
streamReader.Close();
}
When the "string line = streamReader.Readline();" is executed, the variable "line" is null. All I am really after is to see what folders are available so I can see how the original programmer of the FTP server structured their folders, and what naming convention they used. Is this possible if that programmer isn't available any more?
I just had a crazy thought, and it turned out that it worked! Basically, I changed the first couple of lines to read like this:
Uri uri = new Uri("ftp://sampleftpplace.com//");
FtpWebRequest ftpRequest = (FtpWebRequest)WebRequest.Create(uri);
First, I used the Uri class. I'm not sure whether that matters, but what really mattered was the second "/" in the URL. I think the server didn't treat the path as the root directory until I added it. Once I put that in, I'm seeing the folders now. Thanks!

Can't connect to FTP: (553) File name not allowed

I need to FTP a file to a directory. In .NET I have to use a file in the destination folder to create a connection, so I manually put Blank.dat on the server using FTP. I checked the access (ls -l) and it is -rw-r--r--. But when I attempt to connect to the FTP folder I get "The remote server returned an error: (553) File name not allowed" back from the server. The research I have done says that this may arise from a permissions issue, but as I have said, I have permission to view the file and can run ls from the folder. What other reasons could cause this issue, and is there a way to connect to the folder without having to specify a file?
byte[] buffer;
Stream reqStream;
FileStream stream;
FtpWebResponse response;
FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create(new Uri(string.Format("ftp://{0}/{1}", SRV, DIR)));
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(UID, PASS);
request.UseBinary = true;
request.Timeout = 60000 * 2;
for (int fl = 0; fl < files.Length; fl++)
{
request.KeepAlive = (files.Length != fl);
stream = File.OpenRead(Path.Combine(dir, files[fl]));
reqStream = request.GetRequestStream();
buffer = new byte[4096 * 2];
int nRead = 0;
while ((nRead = stream.Read(buffer, 0, buffer.Length)) != 0)
{
reqStream.Write(buffer, 0, nRead);
}
stream.Close();
reqStream.Close();
response = (FtpWebResponse)request.GetResponse();
response.Close();
}
Although I'm replying to an old post, I thought it might help someone.
When you create your FTP URL, make sure you are not including the default directory for that login.
For example, this was the path I was specifying, and I was getting the 553 File name not allowed exception:
ftp://myftpip/gold/central_p2/inbound/article_list/jobs/abc.txt
The login I used had the default directory gold/central_p2, so the supplied URL became invalid because it was trying to locate the whole path inside the default directory. I amended my URL string accordingly and was able to get rid of the exception.
My amended URL looked like:
ftp://myftpip/inbound/article_list/jobs/abc.txt
Thanks,
Sab
This may help with Linux FTP servers.
Linux FTP servers, unlike IIS, don't have a common FTP root directory. Instead, when you log on to the FTP server under some user's credentials, that user's home directory is used as the root. So the FTP directory hierarchy starts from /root/ for the root user and from /home/username for other users.
So, if you need to address a file relative not to the user account's home directory but to the file system root, add an extra / after the server name. The resulting URL will look like:
ftp://servername.net//var/lalala
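A minimal sketch of what that looks like with FtpWebRequest (server name, path, and credentials below are placeholders):
// The double slash after the host makes the remote path absolute on the
// server's file system instead of relative to the login's home directory.
var request = (FtpWebRequest)WebRequest.Create("ftp://servername.net//var/lalala/file.txt");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("user", "password");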
You must be careful with names and paths:
string FTP_Server = @"ftp://ftp.computersoft.com//JohnSmith/";
string myFile = "text1.txt";
string myDir = @"D:/Texts/Temp/";
If you are sending a file called text1.txt located at D:/Texts/Temp to ftp.computersoft.com/JohnSmith,
then
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(FTP_Server+myFile);
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential(FTP_User, FTP_Password);
StreamReader sourceStream = new StreamReader(myDir + myFile);
byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
sourceStream.Close();
Stream requestStream = request.GetRequestStream();
requestStream.Write(fileContents, 0, fileContents.Length);
requestStream.Close();
Notice that as the destination you use
ftp://ftp.computersoft.com//JohnSmith/text1.txt
which contains not only the directory but also the new file name on the FTP server (which in general can be different from the name of the file on your hard drive),
and that as the source you use
D:/Texts/Temp/text1.txt
Your directory has an access restriction.
Delete your directory and then create it again with this code:
//create folder
FtpWebRequest request = (FtpWebRequest)FtpWebRequest.Create("ftp://mpy1.vvs.ir/Subs/sub2");
request.Method = WebRequestMethods.Ftp.MakeDirectory;
request.Credentials = new NetworkCredential(username, password);
request.UsePassive = true;
request.UseBinary = true;
request.KeepAlive = true ;
using (var resp = (FtpWebResponse)request.GetResponse())
{
}
I saw something similar to this a while back; it turned out that I was trying to connect to an internal IIS FTP server that was secured using Active Directory.
In my network credentials I was using new NetworkCredential(@"domain\user", "password"); and that was failing. Switching to new NetworkCredential("user", "password", "domain"); worked for me.
I hope this will be helpful for someone.
If you are using a Linux server, replace your request path from
FtpWebRequest req = (FtpWebRequest)WebRequest.Create(@"ftp://yourdomain.com//yourpath/" + filename);
to
FtpWebRequest req = (FtpWebRequest)WebRequest.Create(@"ftp://yourdomain.com//public_html/folderpath/" + filename);
The folder path is the one you see on the server (e.g., in cPanel).
Although it's an older post, I thought it would be good to share my experience. I faced the same problem and solved it myself. Assuming your permissions are correct, the main cause of this problem is an error in the path, i.e. your FTP path is wrong. Without seeing your path it is impossible to say what's wrong, but one must remember these things:
a. Unlike a browser, FTP doesn't accept some special characters like ~
b. If you have several user accounts under the same IP, do not include the username or the word "home" in the path
c. Don't forget to include "public_html" in the path (normally you only need to access the contents of public_html), otherwise you may end up in a bottomless pit
Another reason for this error could be that the FTP server is case-sensitive. That took me a good while to figure out.
I had this problem when I tried to write to the FTP site's root directory programmatically. When I repeated the operation manually, the FTP server automatically rerouted me to a sub-directory and completed the write operation. I then changed my code to prefix the target filename with the sub-directory, and the operation was successful.
Mine was as simple as a file name collision. A previous file hadn't been moved to an archive folder, so we tried to send it again and got the 553 because the server wouldn't overwrite the existing file.
Check the disk space on the remote server first.
I had the same issue and found out it was because the remote server I was attempting to upload files to had run out of disk space on the partition or filesystem.
To whom it may concern...
I was stuck with this problem for a long time.
I tried to upload a file onto my web server like this:
Say my domain is www.mydomain.com.
I wanted to upload to a subdomain, order.mydomain.com, so I used:
FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create($@"ftp://order.mydomain.com//uploads/{FileName}");
After many tries and getting this 553 error, I found out that I had to make the FTP request refer to the main domain, not the subdomain, and include the subdomain as a subfolder (such a folder is normally created when you create a subdomain).
Since I had created my subdomain's folder outside public_html (at the root), I changed the FTP request to:
FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create($@"ftp://www.mydomain.com//order.mydomain.com//uploads/{FileName}");
And it finally worked.

Downloading all files using FTP and C#

What is the best way to download all files in a remote directory using C# and FTP and save them to a local directory?
Thanks.
Downloading all files in a specific folder seems to be an easy task. However, there are some issues which have to be solved. To name a few:
How to get the list of files (System.Net.FtpWebRequest gives you an unparsed list, and the directory listing format is not standardized in any RFC)
What if the remote directory has both files and subdirectories? Do we have to dive into the subdirectories and download their content?
What if some of the remote files already exist on the local computer? Should they be overwritten? Skipped? Should we overwrite older files only?
What if the local file is not writable? Should the whole transfer fail? Should we skip the file and continue to the next?
How to handle files on a remote disk which are unreadable because we don’t have sufficient access rights?
How are symlinks, hard links and junction points handled? Links can easily be used to create an infinite recursive directory tree. Consider folder A with subfolder B which is in fact not a real folder but a *nix hard link pointing back to folder A. The naive approach ends in an application which never terminates (at least until somebody manages to pull the plug).
A decent third-party FTP component should have a method for handling those issues. The following code uses our Rebex FTP for .NET.
using (Ftp client = new Ftp())
{
// connect and login to the FTP site
client.Connect("mirror.aarnet.edu.au");
client.Login("anonymous", "my#password");
// download all files
client.GetFiles(
"/pub/fedora/linux/development/i386/os/EFI/*",
"c:\\temp\\download",
FtpBatchTransferOptions.Recursive,
FtpActionOnExistingFiles.OverwriteAll
);
client.Disconnect();
}
The code is taken from my blog post available at blog.rebex.net. The blog post also references a sample which shows how to ask the user how to handle each problem (e.g. Overwrite/Overwrite older/Skip/Skip all).
Using the C# FtpWebRequest and FtpWebResponse classes, you can use the following recursion (make sure the folder strings terminate in '\'):
public void GetAllDirectoriesAndFiles(string getFolder, string putFolder)
{
List<string> dirItems = DirectoryListing(getFolder);
foreach (var item in dirItems)
{
if ( item.Contains('.') )
{
GetFile(getFolder + item, putFolder + item);
}
else
{
var subDirPut = new DirectoryInfo(putFolder + "\\" + item);
subDirPut.Create();
GetAllDirectoriesAndFiles(getFolder + item + "\\", subDirPut.FullName + "\\");
}
}
}
The "item.Contains('.')" is a bit primitive, but has worked for my purposes. Post a comment if you need an example of the methods:
GetFile(string getFileAndPath, string putFileAndPath)
or
DirectoryListing(getFolder)
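For reference, here is one possible shape of those two helpers, sketched with FtpWebRequest (this is not the original answerer's code; the credential fields are assumptions, and the usual System.Net, System.IO and System.Collections.Generic namespaces are required):
// Assumed credential fields (placeholders).
private string user = "username";
private string password = "password";

// Returns the bare entry names of the given folder URL, one per line (NLST).
private List<string> DirectoryListing(string getFolder)
{
    var request = (FtpWebRequest)WebRequest.Create(getFolder);
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = new NetworkCredential(user, password);

    var items = new List<string>();
    using (var response = (FtpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
            items.Add(line);
    }
    return items;
}

// Downloads a single remote file to a local path.
private void GetFile(string getFileAndPath, string putFileAndPath)
{
    var request = (FtpWebRequest)WebRequest.Create(getFileAndPath);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.UseBinary = true;
    request.Credentials = new NetworkCredential(user, password);

    using (var response = (FtpWebResponse)request.GetResponse())
    using (var ftpStream = response.GetResponseStream())
    using (var fileStream = File.Create(putFileAndPath))
    {
        ftpStream.CopyTo(fileStream);
    }
}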
For the FTP protocol, you can use the FtpWebRequest class from the .NET Framework. However, it does not have any explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to tell files apart from subdirectories. There's no way to do that in a portable way with the FtpWebRequest. The FtpWebRequest unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for a file and succeed for a directory (or vice versa). I.e. you can try to download the "name": if that succeeds, it's a file; if it fails, it's a directory. But that can become a performance problem when you have a large number of entries.
You may be lucky and in your specific case, you can tell a file from a directory by a file name (i.e. all your files have an extension, while subdirectories do not)
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format)
void DownloadFtpDirectory(
string url, NetworkCredential credentials, string localPath)
{
FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(url);
listRequest.UsePassive = true;
listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
listRequest.Credentials = credentials;
List<string> lines = new List<string>();
using (WebResponse listResponse = listRequest.GetResponse())
using (Stream listStream = listResponse.GetResponseStream())
using (StreamReader listReader = new StreamReader(listStream))
{
while (!listReader.EndOfStream)
{
lines.Add(listReader.ReadLine());
}
}
foreach (string line in lines)
{
string[] tokens =
line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
string name = tokens[8];
string permissions = tokens[0];
string localFilePath = Path.Combine(localPath, name);
string fileUrl = url + name;
if (permissions[0] == 'd')
{
Directory.CreateDirectory(localFilePath);
DownloadFtpDirectory(fileUrl + "/", credentials, localFilePath);
}
else
{
var downloadRequest = (FtpWebRequest)WebRequest.Create(fileUrl);
downloadRequest.UsePassive = true;
downloadRequest.UseBinary = true;
downloadRequest.Method = WebRequestMethods.Ftp.DownloadFile;
downloadRequest.Credentials = credentials;
var response = downloadRequest.GetResponse();
using (Stream ftpStream = response.GetResponseStream())
using (Stream fileStream = File.Create(localFilePath))
{
ftpStream.CopyTo(fileStream);
}
}
}
}
The url must be like:
ftp://example.com/ or
ftp://example.com/path/
Or use a 3rd-party library that supports recursive downloads.
For example, with the WinSCP .NET assembly you can download a whole directory with a single call to Session.GetFiles:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "example.com",
UserName = "user",
Password = "mypassword",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
// Download files
session.GetFiles("/home/user/*", #"d:\download\").Check();
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)
You could use System.Net.WebClient.DownloadFile(), which supports FTP; see the MSDN documentation for details.
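A minimal sketch of that approach (URL, credentials, and paths are placeholders); each call downloads a single file, so you still need your own loop over the file names:
using (var client = new System.Net.WebClient())
{
    client.Credentials = new System.Net.NetworkCredential("user", "password");
    // One call per remote file; combine this with a directory listing to fetch everything.
    client.DownloadFile("ftp://example.com/path/file.bin", @"d:\download\file.bin");
}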
You can use FTPClient from laedit.net. It's under Apache license and easy to use.
It uses FtpWebRequest:
First, you need to use WebRequestMethods.Ftp.ListDirectoryDetails to get the detailed listing of the folder.
Then, for each file, you need to use WebRequestMethods.Ftp.DownloadFile to download it to a local folder.
