C# - Checking several IP addresses with several download methods

I am new to this site so I apologise in advance if I made any formatting errors. Let me know if further clarification is needed.
I am currently working on a program in C# which aims at downloading files from several http or ftp sources.
I want it to go through several IP addresses and then check whether any of several methods can download a known file successfully. If one of these methods downloads the file successfully, it should go on to the next IP and do the same thing again.
So far I am using a foreach loop that runs through an array containing the IPs, creating a folder named "IPxx" as well as an FTP and an HTTP URI, because I do not know in advance whether a given IP needs an FTP or an HTTP address:
string[] ipArray = new string[4];
ipArray[0] = "xxx.xxx.xxx.xx";
ipArray[1] = "xxx.xxx.xxx.xx";
ipArray[2] = "xxx.xxx.xxx.xx";
ipArray[3] = "xxx.xxx.xxx.xx";

foreach (string ip in ipArray)
{
    string ipXx = "IP" + ip.Substring(ip.Length - 2);
    string ipFolder = folder + @"\" + ipXx;
    Directory.CreateDirectory(ipFolder);
    string ftpAddr = "ftp://" + ip;
    string httpAddr = "http://" + ip;
    //DownloadTestMethod(httpAddr, ipFolder);
    //DownloadTestMethod2(ftpAddr, ipFolder);
    //DownloadTestMethod3(httpAddr, ipFolder);
}
So far so good, everything up to this level is working as expected.
However, I struggle as soon as I need to go through several download methods: check whether one can download the file successfully and, if not, move on to the next download method and try the same.
I came up with the following DownloadTestMethod:
public static void DownloadTestMethod(string httpAddr, string ipFolder)
{
    string fileName = "test_10k.bin";
    string downloadpath = httpAddr + "/" + fileName;

    // I want it to check if the http site is online/working
    WebRequest request = WebRequest.Create(downloadpath);
    request.Proxy.Credentials = CredentialCache.DefaultCredentials;
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();

    if (response != null)
    {
        WebClient webClient = new WebClient();
        webClient.Proxy.Credentials = CredentialCache.DefaultCredentials;
        webClient.DownloadFile(downloadpath, ipFolder + @"\" + fileName);
    }
}
It is working for a successfully downloaded file; however, I seem to get "stuck" in this method if it cannot download the requested file.
In this case I want it to jump to the next download method and check again, until all methods have been checked.
Furthermore, if the program has gone through every IP and could not download anything, I want it to send an automated email. I know how to implement the EmailMethod, but once again I struggle with the flow control.
I am an absolute beginner and have no idea how to go from here to get my desired flow.

You might want to look at the Task Parallel Library and in particular its Task.WhenAll() method.
Along with it, you might want to look at error handling for the case where any of your methods returns an error or raises an exception.
https://msdn.microsoft.com/en-us/library/dd270695(v=vs.110).aspx
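The flow control from the question can also be handled sequentially with plain try/catch blocks, without the TPL. Below is a minimal sketch, assuming each download method throws (e.g. a WebException) when it cannot download the file; DownloadTestMethod2 and SendFailureEmail are placeholders standing in for the methods mentioned in the question:
bool anyDownloadSucceeded = false;

foreach (string ip in ipArray)
{
    string ipXx = "IP" + ip.Substring(ip.Length - 2);
    string ipFolder = folder + @"\" + ipXx;
    Directory.CreateDirectory(ipFolder);
    string ftpAddr = "ftp://" + ip;
    string httpAddr = "http://" + ip;

    bool downloaded = false;

    try
    {
        DownloadTestMethod(httpAddr, ipFolder);
        downloaded = true;
    }
    catch (WebException)
    {
        // HTTP attempt failed, fall through to the next method
    }

    if (!downloaded)
    {
        try
        {
            DownloadTestMethod2(ftpAddr, ipFolder); // FTP variant from the question
            downloaded = true;
        }
        catch (WebException)
        {
            // FTP attempt failed as well
        }
    }

    if (downloaded)
    {
        anyDownloadSucceeded = true;
    }
}

if (!anyDownloadSucceeded)
{
    SendFailureEmail(); // stands in for the poster's existing email method
}
If DownloadTestMethod currently hangs instead of throwing, setting request.Timeout on the WebRequest so that a dead server fails quickly instead of blocking is worth considering.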

Related

C# Download an image from URL

I use a website to get stats on wifi usage. The website creates an image of a graph representation of the data. It does this by letting the user set a date; so, for example, let's say it was last month's statistics. The website generates a URL which is then sent to the server, and the server returns an image. An example of the link looks like this:
https://www.example.com/graph/daily_usage?time_from=2015-06-01+00%3A00%3A00+%2B0100&time_to=2015-06-30+23%3A59%3A59+%2B0100&tsp=1436519988
The problem is, I am making a third-party program that will download this image to be used in my program. However, I cannot use the file; it is as if the file is corrupt or something. I have tried a few methods, but maybe someone can suggest a different approach. Basically, how do I download an image that is generated by a server from a URL link?
P.S.
Just noticed that if I download the file by right-clicking in a browser and saving, the image downloads with a size of 17.something kilobytes. But if I use the WebClient method to download the image, it only downloads 1.5 KB. Why would that be? It seems like the WebClient method does not download it completely.
Currently my code
if (hrefAtt == "Usage Graph")
{
string url = element.getAttribute("src");
WebClient client = new WebClient();
client.DownloadFile(url, tempFolderPath + "\\" + currentAcc + "_UsageSummary.png");
wd.AddImagesToDoc(tempFolderPath + "\\" + currentAcc + "_UsageSummary.png");
wd.SaveDocument();
}
TempFolderPath is my desktop\TempFolder\
UPDATE
Out of curiosity, I decided to look at the raw data of the file with Notepad and, interestingly, the image data was actually a copy of the website's homepage HTML code, not the raw data of the image. How does that make sense?
This will download the Google logo:
var img = Bitmap.FromStream(new MemoryStream(new WebClient().DownloadData("https://www.google.co.uk/images/srpr/logo11w.png")));
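If the goal is to save the downloaded bytes to disk rather than load them into a Bitmap, the same DownloadData result can simply be written to a file (the URL and target path below are placeholders):
byte[] data = new WebClient().DownloadData("https://www.google.co.uk/images/srpr/logo11w.png");
File.WriteAllBytes(@"C:\temp\logo11w.png", data);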
First of all, you have to understand the link structure. If all links are the same or close to each other, you have to use Substring/Remove/DateTime etc. methods to build your new request link. For example:
string today = DateTime.Now.ToShortDateString();
string generatedLink = @"http://www.yoururl.com/image" + today + ".jpg";
string generatedFileName = @"C:\Usage\usage" + today + ".jpg";
WebClient wClient = new WebClient();
wClient.DownloadFile(generatedLink, generatedFileName);

Siebel COM Data Control File Transfer

Get the attachment from Siebel using COM Data Control.
SiebelBusObjectInterfaces.SiebelDataControl sblDataControl = new SiebelBusObjectInterfaces.SiebelDataControl();
bool success = sblDataControl.Login("host=\"siebel.TCPIP.None.None://bla bla bla /EAIObjMgr_enu\"", "karephul", getPassword());
string errorCode = sblDataControl.GetLastErrCode() + " " + sblDataControl.GetLastErrText();
SiebelBusObjectInterfaces.SiebelBusObject oBO;
SiebelBusObjectInterfaces.SiebelBusComp serviceRequest;
SiebelBusObjectInterfaces.SiebelBusComp actionAttachment;
oBO = sblDataControl.GetBusObject("Action");
actionAttachment = oBO.GetBusComp("Action Attachment");
success = actionAttachment.ActivateField("Activity Id");
success = actionAttachment.ActivateField("ActivityFileName");
success = actionAttachment.ClearToQuery();
success = actionAttachment.SetSearchSpec("Activity Id", "3-QOUKDD"); // hard code for now.
success = actionAttachment.SetSearchSpec("ActivityFileExt", "txt");
success = actionAttachment.ExecuteQuery(1); // ForwardOnly = 1, I guess;
if (actionAttachment.FirstRecord())
{
    string fileName = actionAttachment.GetFieldValue("ActivityFileName");
    string fileLoc = actionAttachment.InvokeMethod("GetFile", "ActivityFileName");
}
The piece of code below gets the appropriate file, keeps it in the temp folder of the server, and gives me the fully qualified path.
string fileLoc = actionAttachment.InvokeMethod("GetFile", "ActivityFileName");
Is there a way I can get the file to my local machine?
Context:
This code is written in C# and we run it on the client side, which does not have access to the temp directory of the server.
Thanks
Karephul
Since you are connecting via Data Control over HTTP, this is equivalent to connecting with the thin client. If you connect via a dedicated client, you can save the file directly onto your system.
The three options listed below would work, but I suggest sending the file over by email, if attachment size is not a problem.
After talking to some guys working on Siebel, I found that with the COM APIs I cannot get the file to the local machine.
Options:
1. Make the temp folder public and get the file.
2. Ask your Siebel team to expose a webservice to get the file.
3. Ask your Siebel team to provide REST type link to download file.
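For option 3, once the Siebel team exposes such a download link, fetching the attachment on the client side becomes an ordinary HTTP download. A minimal sketch, with a hypothetical URL and the credential pattern from the question:
// hypothetical REST-style download link exposed by the Siebel team
string attachmentUrl = "http://siebelserver/attachments/3-QOUKDD.txt";

WebClient client = new WebClient();
client.Credentials = new NetworkCredential("karephul", getPassword());
client.DownloadFile(attachmentUrl, @"C:\temp\3-QOUKDD.txt");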

User Permissions issue

I am using Visual Studio C# to parse an XML document for file locations from a local search tool I am using. Specifically, I am using C# to query whether the user has access to certain files and to hide those to which it does not have access. I have files that should return access is true; however, because not all files are local (i.e. some are web files without proper names), it is not showing access to files it should be showing access to. The error right now is caused by a URL ending in .aspx?i=573. Is there a workaround, or am I going to have to just remove all of these files?
Edit: More info...
Right now I am using:
foreach (XmlNode xn in nodeList)
{
    string url = xn.InnerText;
    //Label1.Text = url;
    try
    {
        using (FileStream fs = File.OpenRead(url)) { }
    }
    catch
    {
        i++;
        Label2.Text = i.ToString();
        Label1.Text = url;
    }
}
The issue is that when it attempts to open files like the ....aspx?i=573 one, it puts them in the catch stack. If I attempt to open the file manually, however, it opens just fine (i.e. I have read access, but because of either the file type or the '?=' appended to the file name, it gets tossed into the unreadable stack).
I want everything that is readable, either via URL or local access, to be displayed; everything else should end up in the caught error files.
I'm not sure exactly what you are trying to do, but if you only want the path of a URI, you can easily drop the query string portion like this:
Uri baseUri = new Uri("http://www.domain.com/");
Uri myUri = new Uri(baseUri, "home/default.aspx?i=573");
Console.WriteLine(myUri.AbsolutePath); // i.e. "/home/default.aspx"
You cannot have ? in file names in Windows, but they are valid in URIs (that is why IE can open it, but Windows cannot).
Alternatively, you could just replace the '?' with some other character if you are converting a URL to a filename.
In fact, thinking about it now, you could just check whether your "document" is a URI or not, and if it isn't, then try to open the file on the file system. It sounds like you are trying to open any and everything that is supplied, but it wouldn't hurt to perform some checks on the data.
private static bool IsLocalPath(string p)
{
return new Uri(p).IsFile;
}
This is from "Check if the path input is URL or Local File"; it looks like exactly what you are looking for.
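To tie it together with the loop from the question, here is a sketch of how that check could be used to branch between the file system and a web request (the HEAD request is my assumption of a reasonable remote access check, not something from the original post):
foreach (XmlNode xn in nodeList)
{
    string url = xn.InnerText;
    try
    {
        if (IsLocalPath(url))
        {
            // local file: check read access as before
            using (FileStream fs = File.OpenRead(url)) { }
        }
        else
        {
            // remote document: ask the web server instead of the file system
            WebRequest request = WebRequest.Create(url);
            request.Method = "HEAD";
            using (request.GetResponse()) { }
        }
    }
    catch
    {
        i++;
        Label2.Text = i.ToString();
        Label1.Text = url;
    }
}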
FileStream reads and writes local files; "?" is not a valid character in a local file name.
It looks like you want to open both local and remote files. If that is what you are trying to do, you should use the appropriate download method for each type - i.e. for HTTP use WebRequest or related classes.
Note: it would be much easier to answer if you'd say: when url is "..." File.OpenRead(url) fails with an exception, message "...".

Help Needed for parsing FTP files list in c#

I am using this code for getting the list of all the files in a directory,
where webRequestUrl = something.com/directory/
FtpWebRequest fwrr = (FtpWebRequest)FtpWebRequest.Create(new Uri("ftp://" + webRequestUrl));
fwrr.Credentials = new NetworkCredential(username, password);
fwrr.Method = WebRequestMethods.Ftp.ListDirectoryDetails;

StreamReader srr = new StreamReader(fwrr.GetResponse().GetResponseStream());
string str = srr.ReadLine();
ArrayList strList = new ArrayList();
while (str != null)
{
    strList.Add(str);
    str = srr.ReadLine();
}
but I am not getting the list of files; instead I am getting some HTML document type lines.
This FTP server is Windows-based, while the same code works fine against a Unix server.
Please help.
Thanks.
It works for me when the FTP server is on an internal machine and I use ftp://192.168.0.155 - if I try that in IE I get the same HTML result as yours.
I wonder whether it is happening because of the URL. Can you try replacing the URL with the IP address (just a wild guess)? Even if you are getting HTML, you can strip the unnecessary part and parse the files.
I even tried it with ftp://sub.a.com/somefolder and it worked for me. It seems the browser wraps HTML around the FTP response, because I get different HTML when I open the FTP site in IE and in Chrome.

Downloading all files using FTP and C#

What is the best way to download all files in a remote directory using C# and FTP and save them to a local directory?
Thanks.
Downloading all files in a specific folder seems to be an easy task. However, there are some issues which have to be solved. To name a few:
How to get the list of files (System.Net.FtpWebRequest gives you an unparsed list, and the directory list format is not standardized in any RFC)
What if the remote directory has both files and subdirectories? Do we have to dive into the subdirs and download their content?
What if some of the remote files already exist on the local computer? Should they be overwritten? Skipped? Should we overwrite older files only?
What if a local file is not writable? Should the whole transfer fail? Should we skip the file and continue to the next?
How to handle files on the remote disk which are unreadable because we don't have sufficient access rights?
How are symlinks, hard links and junction points handled? Links can easily be used to create an infinite recursive directory tree structure. Consider folder A with subfolder B which in fact is not a real folder but a *nix hard link pointing back to folder A. The naive approach would end in an application which never finishes (at least until somebody manages to pull the plug).
A decent third-party FTP component should have a method for handling those issues. The following code uses our Rebex FTP for .NET.
using (Ftp client = new Ftp())
{
    // connect and login to the FTP site
    client.Connect("mirror.aarnet.edu.au");
    client.Login("anonymous", "my@password");

    // download all files
    client.GetFiles(
        "/pub/fedora/linux/development/i386/os/EFI/*",
        "c:\\temp\\download",
        FtpBatchTransferOptions.Recursive,
        FtpActionOnExistingFiles.OverwriteAll
    );

    client.Disconnect();
}
The code is taken from my blog post available at blog.rebex.net. The blog post also references a sample which shows how to ask the user how to handle each problem (e.g. Overwrite / Overwrite older / Skip / Skip all).
Using C# FtpWebRequest and FtpWebResponse, you can use the following recursion (make sure the folder strings terminate in '\'):
public void GetAllDirectoriesAndFiles(string getFolder, string putFolder)
{
    List<string> dirItems = DirectoryListing(getFolder);
    foreach (var item in dirItems)
    {
        if (item.Contains('.'))
        {
            GetFile(getFolder + item, putFolder + item);
        }
        else
        {
            var subDirPut = new DirectoryInfo(putFolder + "\\" + item);
            subDirPut.Create();
            GetAllDirectoriesAndFiles(getFolder + item + "\\", subDirPut.FullName + "\\");
        }
    }
}
The "item.Contains('.')" is a bit primitive, but has worked for my purposes. Post a comment if you need an example of the methods:
GetFile(string getFileAndPath, string putFileAndPath)
or
DirectoryListing(getFolder)
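In case the comment route is not needed, here is a rough sketch of what those two helpers could look like with FtpWebRequest, assuming getFolder is a full ftp:// URL; the credentials, error handling and listing format are my assumptions, not the answer author's code:
public List<string> DirectoryListing(string folder)
{
    // plain name listing (NLST), one entry per line
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(folder);
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = new NetworkCredential("user", "password"); // placeholders

    var lines = new List<string>();
    using (var response = request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            lines.Add(line);
        }
    }
    return lines;
}

public void GetFile(string getFileAndPath, string putFileAndPath)
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(getFileAndPath);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.UseBinary = true;
    request.Credentials = new NetworkCredential("user", "password"); // placeholders

    using (var response = request.GetResponse())
    using (Stream ftpStream = response.GetResponseStream())
    using (Stream fileStream = File.Create(putFileAndPath))
    {
        ftpStream.CopyTo(fileStream);
    }
}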
For the FTP protocol you can use the FtpWebRequest class from the .NET Framework, though it does not have any explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to identify files from subdirectories. There's no way to do that in a portable way with the FtpWebRequest. The FtpWebRequest unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for a file and succeed for a directory (or vice versa). I.e. you can try to download the "name": if that succeeds, it's a file; if it fails, it's a directory. But that can become a performance problem when you have a large number of entries.
You may be lucky and in your specific case, you can tell a file from a directory by a file name (i.e. all your files have an extension, while subdirectories do not)
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use a *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
void DownloadFtpDirectory(
    string url, NetworkCredential credentials, string localPath)
{
    FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(url);
    listRequest.UsePassive = true;
    listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    listRequest.Credentials = credentials;

    List<string> lines = new List<string>();

    using (WebResponse listResponse = listRequest.GetResponse())
    using (Stream listStream = listResponse.GetResponseStream())
    using (StreamReader listReader = new StreamReader(listStream))
    {
        while (!listReader.EndOfStream)
        {
            lines.Add(listReader.ReadLine());
        }
    }

    foreach (string line in lines)
    {
        string[] tokens =
            line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
        string name = tokens[8];
        string permissions = tokens[0];

        string localFilePath = Path.Combine(localPath, name);
        string fileUrl = url + name;

        if (permissions[0] == 'd')
        {
            Directory.CreateDirectory(localFilePath);
            DownloadFtpDirectory(fileUrl + "/", credentials, localFilePath);
        }
        else
        {
            var downloadRequest = (FtpWebRequest)WebRequest.Create(fileUrl);
            downloadRequest.UsePassive = true;
            downloadRequest.UseBinary = true;
            downloadRequest.Method = WebRequestMethods.Ftp.DownloadFile;
            downloadRequest.Credentials = credentials;

            var response = downloadRequest.GetResponse();
            using (Stream ftpStream = response.GetResponseStream())
            using (Stream fileStream = File.Create(localFilePath))
            {
                ftpStream.CopyTo(fileStream);
            }
        }
    }
}
The url must be like:
ftp://example.com/ or
ftp://example.com/path/
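A call to the method above might then look like this (host, credentials and local folder are placeholders):
DownloadFtpDirectory(
    "ftp://example.com/path/",
    new NetworkCredential("user", "password"),
    @"C:\temp\download");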
Or use 3rd party library that supports recursive downloads.
For example, with the WinSCP .NET assembly you can download a whole directory with a single call to Session.GetFiles:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "example.com",
    UserName = "user",
    Password = "mypassword",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    // Download files
    session.GetFiles("/home/user/*", @"d:\download\").Check();
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)
You could use System.Net.WebClient.DownloadFile(), which supports FTP. See MSDN for details.
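For a single FTP file that can look like the following (host, credentials and paths are placeholders):
WebClient client = new WebClient();
client.Credentials = new NetworkCredential("user", "password");
client.DownloadFile("ftp://example.com/pub/somefile.zip", @"C:\temp\somefile.zip");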
You can use FTPClient from laedit.net. It is under the Apache license and easy to use.
It uses FtpWebRequest:
first you need to use WebRequestMethods.Ftp.ListDirectoryDetails to get the detailed listing of the folder
then for each file you need to use WebRequestMethods.Ftp.DownloadFile to download it to a local folder
