copy multiple remote files to another remote file via HttpWebRequest C#

I have a remote folder with n files and I need to copy their content into another remote file. I guess it can be done through streams, and this is what I tried:
WebRequest destRequest = WebRequest.Create(destFile);
destRequest.Method = "PUT";
destRequest.Headers.Add("x-ms-blob-type", "BlockBlob"); //just an example with Azure blob, doesn't matter
using (Stream destStream = destRequest.GetRequestStream())
{
string sourceName = "mysourcefolder";
int blockSize = 8388608; //all the files have the same length, except one (sometimes)
for (int i = 0; i < n; i++)
{
string source = sourceName + i;
WebRequest sourceRequest = WebRequest.Create(source);
destRequest.Method = "GET";
HttpWebResponse destResp = (HttpWebResponse)destRequest.GetResponse();
using (Stream sourceStream = destResp.GetResponseStream())
{
sourceStream.CopyTo(destStream, blockSize);
}
}
Console.Write("ok");
}
}
catch (Exception e)
{
Console.Write("nope !");
}
There are multiple issues in my code:
1) I have to specify the length in my PUT request. Probably it is blockSize*n, since I get no exception about this;
2) If that is the case, I still get the exception Cannot close stream until all bytes are written. What does it mean?

There is confusion between the source and destination requests in your code.
I have added comments to the changed lines.
WebRequest destRequest = WebRequest.Create(destFile);
destRequest.Method = "PUT";
destRequest.Headers.Add("x-ms-blob-type", "BlockBlob"); //just an example with Azure blob, doesn't matter
using (Stream destStream = destRequest.GetRequestStream())
{
string sourceName = "mysourcefolder";
//int blockSize = 8388608; //all the files have the same length, except one (sometimes)
for (int i = 0; i < n; i++)
{
string source = sourceName + i;
WebRequest sourceRequest = WebRequest.Create(source);
sourceRequest.Method = "GET";
//HttpWebResponse destResp = (HttpWebResponse)destRequest.GetResponse();
//using (Stream sourceStream = destResp.GetResponseStream())
// you need source response
HttpWebResponse sourceResp = (HttpWebResponse)sourceRequest.GetResponse();
using (Stream sourceStream = sourceResp.GetResponseStream())
{
sourceStream.CopyTo(destStream);
}
}
// The request is made here
var destinationResponse = (HttpWebResponse) destRequest.GetResponse();
//Console.Write("ok");
Console.Write(destinationResponse.StatusCode.ToString());
}
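For issue 1), the PUT's Content-Length has to be the exact total of all the source files, not blockSize*n, whenever the last file is shorter. A minimal sketch of computing it before opening the request stream (the helper name and the sizes are mine, not from the post):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class UploadLength
{
    // Sum the exact byte sizes of the source parts. Setting
    // destRequest.ContentLength to this total before GetRequestStream()
    // avoids the "Cannot close stream until all bytes are written" error,
    // which means fewer (or more) bytes were written than promised.
    public static long TotalLength(IEnumerable<long> partSizes) => partSizes.Sum();
}
```

Usage would be along the lines of `destRequest.ContentLength = UploadLength.TotalLength(sizes);`, where the sizes come from each source response's own Content-Length header.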

Related

Download a URL image to a client PC via C#

I'm trying to access a file on the server via an API that is behind Basic Auth. I then want to download that to a client's PC.
I've got the following code which does GET the URL from behind the basic auth, but the image never downloads properly. I either get a failed network error message, or a message saying I can't download it because my machine doesn't have an app installed to open it. It's a PNG, so it definitely does!
It goes the whole way through the code without erroring, so I'm confused as to why it's not downloading correctly to the client's machine (my PC while I'm testing!).
In the code I am specifying one file and I have specified its length in bytes just to try and narrow down where I'm going wrong. Normally this could be any file of any length!
This is the code I have:
//Create a stream for the file
Stream stream = null;
var size = fileResp.ContentLength; //I used this to determine the file was 64196 in size
//This controls how many bytes to read at a time and send to the client
int bytesToRead = 64196;
// Buffer to read bytes in chunk size specified above
byte[] buffer = new Byte[bytesToRead];
string url= "https://myURL/images/image-2019-04-02-16-25-18-458.png";
WebRequest myReq = WebRequest.Create(url);
string credentials = "username:pwd";
CredentialCache mycache = new CredentialCache();
myReq.Headers["Authorization"] = "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes(credentials));
myReq.Method = "GET";
// The number of bytes read
try
{
//Create a response for this request
HttpWebResponse fileResp = (HttpWebResponse)myReq.GetResponse();
if (myReq.ContentLength > 0)
fileResp.ContentLength = myReq.ContentLength;
//Get the Stream returned from the response
stream = fileResp.GetResponseStream();
// prepare the response to the client. resp is the client Response
var resp = HttpContext.Current.Response;
//Indicate the type of data being sent
string contentType = MimeMapping.GetMimeMapping("new.png");
resp.ContentType = contentType;
string fileName = "new.png";
//Name the file
resp.AddHeader("Content-Disposition", "attachment; filename=\"" + fileName + "\"");
resp.AddHeader("Content-Length", fileResp.ContentLength.ToString());
int length;
do
{
// Verify that the client is connected.
if (resp.IsClientConnected)
{
// Read data into the buffer.
length = stream.Read(buffer, 0, bytesToRead);
// and write it out to the response's output stream
resp.OutputStream.Write(buffer, 0, length);
// Flush the data
resp.Flush();
//Clear the buffer
buffer = new Byte[bytesToRead];
}
else
{
// cancel the download if client has disconnected
length = -1;
}
} while (length > 0); //Repeat until no data is read
}
finally
{
if (stream != null)
{
//Close the input stream
stream.Close();
}
}
The output from here: fileResp.GetResponseStream();
At first, please try to test if the following code works on your computer.
private bool DownloadImage(string imgurl, string filename)
{
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(imgurl);
HttpWebResponse response = null;
try
{
response = (HttpWebResponse)request.GetResponse();
}
catch
{
response = null;
return false;
}
if (response != null && response.StatusCode == HttpStatusCode.OK)
{
Stream receiveStream = response.GetResponseStream();
Bitmap bitmap = new Bitmap(receiveStream);
if (bitmap != null)
{
bitmap.Save(filename);
}
receiveStream.Flush();
receiveStream.Close();
response.Close();
return true;
}
else
{
return false;
}
}
private void button1_Click(object sender, EventArgs e)
{
string imgurl = "https://upload.wikimedia.org/wikipedia/commons/4/42/Cute-Ball-Go-icon.png";
string filename = "D:\\download_test.png";
bool bIsDownloadSuccess = DownloadImage(imgurl, filename);
}
This code works well for me.
It doesn't error, but it returns false.
Please check where false is returned.
If there is no error, the problem will be in the Windows system.
Please try it and let me know.
Thanks.
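As an aside, if the goal is just to save the bytes rather than decode them, a plain stream copy avoids the Bitmap round-trip (Bitmap.Save re-encodes the image and requires System.Drawing). A hedged sketch; the helper name is mine:

```csharp
using System.IO;

static class StreamSaver
{
    // Write any readable stream to a file byte-for-byte. Unlike Bitmap.Save,
    // this does not re-encode the image, so the bytes on disk match the
    // bytes the server sent.
    public static void SaveToFile(Stream source, string path)
    {
        using (var fs = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            source.CopyTo(fs);
        }
    }
}
```

In DownloadImage above, `StreamSaver.SaveToFile(receiveStream, filename);` could replace the Bitmap code.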

Bulk download by date modified - 550 File unavailable

I'm trying to create a script that downloads multiple text documents from an FTP server by date modified that are not more than a day old:
if (_datemodified > DateTime.Now.AddDays(-2));
But when I'm running my code it manages to download 200-300 files then times out with
" System.Net.WebException: The remote server returned an error: (550)
File unavailable (e.g., file not found, no access)."
Here's a snippet of the code that the error points to:
static void Checkdatemodified(string url, string savePath)
{
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
request.Credentials = new NetworkCredential("user", "******");
request.UseBinary = true;
DateTime temp;
using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
{
temp = response.LastModified;
}
FtpWebRequest request2 = (FtpWebRequest)WebRequest.Create(url);
request2.Credentials = new NetworkCredential("user", "******");
request2.Method = WebRequestMethods.Ftp.DownloadFile;
request2.UseBinary = true;
if (temp > DateTime.Now.AddDays(-2))
{
using (FtpWebResponse response2 = (FtpWebResponse)request2.GetResponse())
{
// DateTime temp = response.LastModified;
using (Stream rs = response2.GetResponseStream())
{
using (FileStream ws = new FileStream(savePath, FileMode.Create))
{
byte[] buffer = new byte[2048];
int bytesRead = rs.Read(buffer, 0, buffer.Length);
while (bytesRead > 0)
{
ws.Write(buffer, 0, bytesRead);
bytesRead = rs.Read(buffer, 0, buffer.Length);
}
}
// DateTime temp = Checkdatemodified(url);
System.IO.File.SetLastWriteTime(savePath, temp);
}
}
}
}
And here's the code that calls the function
//Create new FTPClient in memory
using (var ftpClient = new FtpClient())
{
// Sets the location of the FTP folder we will use in the request
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://(ftp folder link)");
//Set the method it will use to list the items in the directory
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
// Set the credentials used to get onto the FTP
request.Credentials = new NetworkCredential("user", "******");
//Processes request
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
//Stream reader reads each character of a line
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);
//Creates new list of strings and a new string variable
List<string> lines = new List<string>();
List<string> lines2 = new List<string>();
string line;
//For each line the streamreader reads
//Split the line into an array and take the last entry and add this to the list
while ((line = reader.ReadLine()) != null)
{
string[] temp = line.Split(null);
string temp2 = temp[temp.Length - 1];
lines.Add(temp2);
}
//Close everything
responseStream.Close();
reader.Close();
//For each line in the list, starting at number 2 (the first 2 are just the directory names)
for (int i = 2; i < lines.Count - 1; i++)
{
//Set temp strings. 1 for the file path + file name and the other for the location to download to
string temp = "ftp://(ftp folder)" + lines[i];
string temp2 = @"\\(server folder location)" + lines[i];
//If the date modified on the FTP is between now and 2 days old then download the file
if (System.IO.File.Exists(temp2))
{
System.IO.File.Delete(temp2);
}
Checkdatemodified(temp, temp2);
//File.AppendAllText(@"C:\TestFolder\error.txt", value);
}
// DownloadFtpFile(temp, temp2);
}
}
I feel the issue is that I'm using 3 different FtpWebRequests per download for 300+ files, which are:
1) Get a list of directory
2) Find date modified
3) Download the files
Or it could be an issue with the list that holds the file names.
Does anyone know why this isn't working? Or is there a better way of bulk downloading FTP files by date modified?
P.S. - The code is a bit messy and needs to be cleaned up, but I'm just trying to make it work first.
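One way to drop request 2) entirely is to take the timestamp from the ListDirectoryDetails output you already have, halving the requests per file. The parser below is a hedged sketch for a Unix-style listing (e.g. `-rw-r--r-- 1 owner group 1234 Apr 02 16:25 name.txt`); FTP listing formats vary by server, so the field positions are an assumption:

```csharp
using System;
using System.Globalization;

static class FtpListingParser
{
    // Extract the modification time from a Unix-style LIST line.
    // Recent entries carry "MMM d HH:mm"; entries older than ~6 months
    // carry "MMM d yyyy" instead, so both formats are tried.
    public static DateTime ParseModified(string listLine)
    {
        var parts = listLine.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
        // Fields: perms links owner group size month day time-or-year name
        string stamp = parts[5] + " " + parts[6] + " " + parts[7];
        return DateTime.ParseExact(stamp,
            new[] { "MMM d HH:mm", "MMM d yyyy" },
            CultureInfo.InvariantCulture, DateTimeStyles.None);
    }
}
```

With this, the `while ((line = reader.ReadLine()) != null)` loop could keep both the name and the parsed date, and Checkdatemodified would only need the download request.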

Monitoring upload progress with HttpWebRequest

I've recently written a C# function that does a multi part form post for uploading files. To track the progress, I'd write the form data to the request stream at 4096 bytes at a time and call back with each write. However, it seems that the request does not even get sent until GetResponseAsync() is called.
If this is the case, is the reporting of every 4096 bytes written to the request stream an accurate reporting of upload progress?
If not, how can I accurately report progress? WebClient is out of the question for me; this is in a PCL Xamarin project.
private async Task<string> PostFormAsync (string postUrl, string contentType, byte[] formData)
{
try {
HttpWebRequest request = WebRequest.Create (postUrl) as HttpWebRequest;
request.Method = "POST";
request.ContentType = contentType;
request.Headers ["Cookie"] = Constants.Cookie;
byte[] buffer = new byte[4096];
int count = 0;
int length = 0;
using (Stream requestStream = await request.GetRequestStreamAsync ()) {
using (Stream inputStream = new MemoryStream (formData)) {
while ((count = await inputStream.ReadAsync (buffer, 0, buffer.Length)) > 0) {
await requestStream.WriteAsync (buffer, 0, count);
length += count;
Device.BeginInvokeOnMainThread (() => {
_progressBar.Progress = (double)length / formData.Length;
});
}
}
}
_progressBar.Progress = 0;
WebResponse resp = await request.GetResponseAsync ();
using (Stream stream = resp.GetResponseStream ()) {
StreamReader respReader = new StreamReader (stream);
return respReader.ReadToEnd ();
}
} catch (Exception e) {
Debug.WriteLine (e.ToString ());
return String.Empty;
}
}
Please note that I am asking about monitoring progress of an upload at 4096 bytes at a time, not a download
I ended up accomplishing this by setting the AllowWriteStreamBuffering property of the WebRequest equal to false and the SendChunked property to true.
HOWEVER Xamarin.PCL (Profile 78) does not allow you to access these properties of the HttpWebRequest, so I had to instantiate my HttpWebRequest and return it from a dependency service in my platform specific project (only tested in iOS).
public class WebDependency : IWebDependency
{
public HttpWebRequest GetWebRequest(string uri)
{
var request = WebRequest.Create (uri) as HttpWebRequest;
request.SendChunked = true;
request.AllowWriteStreamBuffering = false;
return request;
}
}
And then to instantiate my web request -
HttpWebRequest request = DependencyService.Get<IWebDependency>().GetWebRequest(uri);
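Once chunked sending is on, per-chunk progress is just bytes written over the total. One pitfall worth a sketch (the helper name is mine): with int operands, `length / formData.Length` truncates to 0 for every partial chunk, so cast before dividing:

```csharp
static class UploadProgress
{
    // Fraction uploaded, in [0, 1]. The cast to double matters: with
    // integer operands the division truncates to 0 until the last chunk.
    public static double Fraction(long bytesSent, long totalBytes)
        => totalBytes == 0 ? 0 : (double)bytesSent / totalBytes;
}
```

In the loop above, `_progressBar.Progress = UploadProgress.Fraction(length, formData.Length);` would report each 4096-byte write correctly.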

Progress in uploading in ftp server c#

I've got the following code:
foreach (string dir in dirs) { // dirs holds all the files in the directory
try{
// Get an instance of WebClient
WebClient client = new System.Net.WebClient();
// parse the ftp host and file into a uri path for the upload
Uri uri = new Uri(m_FtpHost + new FileInfo(dir).Name);
// set the username and password for the FTP server
client.Credentials = new System.Net.NetworkCredential(m_FtpUsername, m_FtpPassword);
// upload the file asynchronously, non-blocking.
client.UploadFileAsync(uri, "STOR",dir);
}
catch(Exception e){
print(e.Message);
}
}
Can I retrieve the progress of the upload? I have 4-5 files in dirs. I want the exact progress (not files uploaded/total files).
EDIT: Thus the right approach is the following:
public int percentage;
try{
// Get an instance of WebClient
WebClient client = new System.Net.WebClient();
// parse the ftp host and file into a uri path for the upload
Uri uri = new Uri(m_FtpHost + new FileInfo(dir).Name);
// set the username and password for the FTP server
client.Credentials = new System.Net.NetworkCredential(m_FtpUsername, m_FtpPassword);
// upload the file asynchronously, non-blocking.
client.UploadProgressChanged += WebClientUploadProgressChanged;
client.UploadFileCompleted += WebClientUploadCompleted;
client.UploadFileAsync(uri, "STOR",dir);
}
catch(Exception e){
print(e.Message);
}
void WebClientUploadProgressChanged(object sender, UploadProgressChangedEventArgs e)
{
percentage = e.ProgressPercentage;
}
void WebClientUploadCompleted(object sender, UploadFileCompletedEventArgs e)
{
print( "Upload is finished. ");
}
I added this implementation to my code; however, it seems that it doesn't print anything to the console.
WebClient contains a dedicated event for this
public event UploadProgressChangedEventHandler UploadProgressChanged
https://msdn.microsoft.com/en-us/library/system.net.webclient.uploadprogresschanged(v=vs.110).aspx
EDIT: HttpWebRequest approach, based on a Google result:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "POST";
request.ContentType = "text/plain";
request.Timeout = -1; //Infinite wait for the response.
// Get the file information object.
FileInfo fileInfo = new FileInfo("C:\\Test\\uploadFile.dat");
// Set the file content length.
request.ContentLength = fileInfo.Length;
// Get the number of segments the file will be uploaded to the stream.
int segments = Convert.ToInt32(fileInfo.Length / (1024 * 4));
// Get the source file stream.
using (FileStream fileStream = fileInfo.OpenRead())
{
// Create 4KB buffer which is file page size.
byte[] tempBuffer = new byte[1024 * 4];
int bytesRead = 0;
// Write the source data to the network stream.
using (Stream requestStream = request.GetRequestStream())
{
// Loop till the file content is read completely.
while ((bytesRead = fileStream.Read(tempBuffer, 0, tempBuffer.Length)) > 0)
{
// Write the 4 KB data in the buffer to the network stream.
requestStream.Write(tempBuffer, 0, bytesRead);
// Update your progress bar here using segment count.
}
}
}
// Post the request and Get the response from the server.
using (WebResponse response = request.GetResponse())
{
// Request is successfully posted to the server.
}

Download the first 1000 bytes

I need to download a text file from the internet using C#. The file size can be quite large and the information I need is always within the first 1000 bytes. Is this possible?
Stolen from here.
string GetWebPageContent(string url)
{
string result = string.Empty;
HttpWebRequest request;
const int bytesToGet = 1000;
request = WebRequest.Create(url) as HttpWebRequest;
//get first 1000 bytes
request.AddRange(0, bytesToGet - 1);
// the following code is one alternative; adapt it to your needs
using (WebResponse response = request.GetResponse())
{
using (Stream stream = response.GetResponseStream())
{
byte[] buffer = new byte[1024];
int read = stream.Read(buffer, 0, 1000);
Array.Resize(ref buffer, read);
return Encoding.ASCII.GetString(buffer);
}
}
}
(Edited as requested in the comments... ;) )
I did this as an answer to your newer question. You could put the range header in too if you want, but I excluded it.
string GetWebPageContent(string url)
{
//string result = string.Empty;
HttpWebRequest request;
const int bytesToGet = 1000;
request = WebRequest.Create(url) as HttpWebRequest;
var buffer = new char[bytesToGet];
using (WebResponse response = request.GetResponse())
{
using (StreamReader sr = new StreamReader(response.GetResponseStream()))
{
sr.Read(buffer, 0, bytesToGet);
}
}
return new string(buffer);
}
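One caveat with both snippets: a single `Read` may legally return fewer bytes than requested, especially on a network stream, so the buffer can come back partially filled. A small loop makes the 1000-byte read reliable (the helper name is mine):

```csharp
using System.IO;

static class StreamUtil
{
    // Keep reading until 'count' bytes arrive or the stream ends; returns
    // the number of bytes actually read. A single Read() may stop short.
    public static int ReadUpTo(Stream stream, byte[] buffer, int count)
    {
        int total = 0;
        while (total < count)
        {
            int n = stream.Read(buffer, total, count - total);
            if (n == 0) break; // end of stream
            total += n;
        }
        return total;
    }
}
```

In the first answer, `int read = StreamUtil.ReadUpTo(stream, buffer, 1000);` would guarantee the full range arrives before decoding.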
