File in use error after copying with WebClient from FTP - C#

I download a file from a local FTP server with this code:
System.Net.WebClient oClientFTP = new System.Net.WebClient();
oClientFTP.Credentials = new System.Net.NetworkCredential("user", "password");
oClientFTP.DownloadFile("ftp://192.168.0.10/files/test.pdf", "test.pdf");
oClientFTP.Dispose();
The file is copied correctly, but it is never released: anything I try to do with it afterwards tells me the file is in use by another application. I tried tracking down the open handle with Process Explorer, but that didn't solve the problem.
I also tried copying the file to another file, but the problem is the same.
How can I free the file after copying?

I solved it by downloading into a stream and writing that stream to the file myself:
using (MemoryStream stream = new MemoryStream(oClientFTP.DownloadData(cFtp + cNomefile)))
{
    using (FileStream outputFileStream = new FileStream(cNomefile, FileMode.Create))
    {
        stream.CopyTo(outputFileStream);
    }
}
Both streams are closed by the using blocks, so no handle is left on the file.
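A shorter equivalent, assuming the same cFtp and cNomefile variables: File.WriteAllBytes creates, writes and closes the file in a single call.
byte[] data = oClientFTP.DownloadData(cFtp + cNomefile); // download into memory first
File.WriteAllBytes(cNomefile, data);                     // write and close in one call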

Related

Read file from Azure into MemoryStream C#

I have an .ini config file located on Azure. I don't want to download this file, which is how it's currently being handled; I want to read it into a MemoryStream, parse it from there, and then have the MemoryStream automatically flush the data.
Is there any way to do this without having to download the file onto the local drive?
The current download method is:
myWebClient.DownloadFile("AzureLink", @"C:\Program Files (x86)\MyProgram\downloadedFile.ini");
I assume this is what you're looking for:
WebClient wc = new WebClient();
using (MemoryStream stream = new MemoryStream(wc.DownloadData(url)))
{
    // your code in here
}
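If it helps, here is a minimal sketch of the parsing step, assuming the file is plain text; the actual ini handling (sections, key/value pairs) is left out:
WebClient wc = new WebClient();
using (MemoryStream stream = new MemoryStream(wc.DownloadData(url)))
using (StreamReader reader = new StreamReader(stream))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // parse each ini line here, e.g. split on '=' for key/value pairs
    }
}
The MemoryStream lives only inside the using block, so nothing touches the local drive and the buffer is released as soon as parsing finishes.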

Download file directly to memory

I would like to load an Excel file directly from an FTP site into a memory stream, then open the file in the FarPoint Spread control using the OpenExcel(Stream) method. My issue is I'm not sure it's possible to download a file directly into memory. Does anyone know if this is possible?
Yes, you can download a file from FTP to memory.
I think you can even pass the Stream from the FTP server to be processed by FarPoint.
WebRequest request = WebRequest.Create("ftp://asd.com/file");
using (WebResponse response = request.GetResponse())
using (Stream responseStream = response.GetResponseStream())
{
    OpenExcel(responseStream);
}
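One caveat, as an assumption since I don't know FarPoint's internals: the FTP response stream is forward-only, so if OpenExcel needs to seek, buffer it into a MemoryStream first:
WebRequest request = WebRequest.Create("ftp://asd.com/file");
using (WebResponse response = request.GetResponse())
using (Stream responseStream = response.GetResponseStream())
using (MemoryStream buffered = new MemoryStream())
{
    responseStream.CopyTo(buffered); // buffer the forward-only FTP stream
    buffered.Position = 0;           // rewind before handing it off
    OpenExcel(buffered);
}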
Using WebClient you can do nearly the same. Generally WebClient is easier to use but gives you fewer configuration options and less control (e.g. no timeout setting).
WebClient wc = new WebClient();
using (MemoryStream stream = new MemoryStream(wc.DownloadData("ftp://asd.com/file")))
{
    OpenExcel(stream);
}
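As an aside on the timeout limitation mentioned above: it can be worked around by subclassing WebClient. A minimal sketch, where TimeoutWebClient and TimeoutMs are illustrative names rather than framework API:
class TimeoutWebClient : WebClient
{
    public int TimeoutMs { get; set; } = 30000;

    // WebClient builds its requests here, so this is the one hook needed
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        request.Timeout = TimeoutMs;
        return request;
    }
}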
Take a look at WebClient.DownloadData. You should be able to download the file directly to memory without writing it to a file first.
This is untested, but something like:
var spreadSheetStream = new MemoryStream(new WebClient().DownloadData(yourFilePath));
I'm not familiar enough with FarPoint to say whether the stream can be used directly with the OpenExcel method. Online examples show the method being used with a FileStream, but I'd assume any kind of Stream would be accepted.
Download file from URL to memory.
My answer does not show exactly how to download a file for use in Excel, but how to download it into a general-purpose in-memory byte array:
private static byte[] DownloadFile(string url)
{
    using (WebClient webClient = new WebClient())
    {
        // DownloadData returns the full response body as a byte array
        return webClient.DownloadData(url);
    }
}
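Hypothetical usage, feeding the resulting bytes into the FarPoint scenario above:
byte[] data = DownloadFile("ftp://asd.com/file");
using (var stream = new MemoryStream(data))
{
    OpenExcel(stream);
}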

Overwriting local files with remote files causes Unity Standalone Application To Freeze

I am a developer using the Unity Game Engine, trying to overwrite local files with files from an FTP server. I am using the System.IO.File.WriteAllBytes function to do so.
When I start the application and trigger the code that has been updated, my application freezes.
In my Windows Forms code it arrives here: it uses the WebClient instance to download the file, and overwrites the local file if the remote one is larger:
public void downloadFile(WebClient webClient, string urlAddress,
                         string location, byte[] localFile)
{
    webClient.Proxy = null;
    webClient.Credentials = new NetworkCredential("<user>", "<pass>");
    byte[] fileData = webClient.DownloadData("ftp://" + urlAddress);

    /*
     * Only download if bytes of remote file
     * is larger than bytes of local file
     */
    if (fileData.Length > localFile.Length)
    {
        File.WriteAllBytes(location, fileData);
    }
}
Using a FileStream causes the application to freeze just as well:
FileStream _FileStream = new FileStream(location, FileMode.Create, FileAccess.Write);
_FileStream.Write(fileData, 0, fileData.Length);
_FileStream.Close();
I also tried writing all the files that needed updating to a temporary folder and then using File.Copy.
What am I supposed to be doing to properly overwrite files?
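One assumption worth testing rather than a confirmed diagnosis: DownloadData is synchronous, so a slow FTP transfer blocks the calling thread, and blocking Unity's main thread looks exactly like a freeze. A sketch of the same method on the task-based API (needs using System.Threading.Tasks and, in Unity, the .NET 4.x scripting runtime):
public async Task downloadFileAsync(WebClient webClient, string urlAddress,
                                    string location, byte[] localFile)
{
    webClient.Proxy = null;
    webClient.Credentials = new NetworkCredential("<user>", "<pass>");

    // Awaiting keeps the main thread free while the transfer runs
    byte[] fileData = await webClient.DownloadDataTaskAsync("ftp://" + urlAddress);

    if (fileData.Length > localFile.Length)
    {
        File.WriteAllBytes(location, fileData);
    }
}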

How to avoid the 0-byte file on Webclient download error

When using the following code to download a file:
WebClient wc = new WebClient();
wc.DownloadFileCompleted += new System.ComponentModel.AsyncCompletedEventHandler(wc_DownloadFileCompleted);
wc.DownloadFileAsync(new Uri("http://path/file"), "localpath/file");
and an error occurs during the download (no internet connection, file not found, etc.), it allocates a 0-byte file at localpath/file, which can get quite annoying.
Is there a way to avoid that in a clean way?
(I already probe for 0-byte files after a download error and delete them, but I don't think that is the recommended solution.)
If you reverse engineer the code for WebClient.DownloadFile you will see that the FileStream is instantiated before the download even begins. This is why the file gets created even if the download fails. There's no way to amend that code, so you should consider a different approach.
There are many ways to approach this problem. Consider using WebClient.DownloadData rather than WebClient.DownloadFile and only creating or writing to a file when the download is complete and you are sure you have the data you want.
WebClient client = new WebClient();
client.DownloadDataCompleted += (sender, eventArgs) =>
{
    // Only touch the file system when the download actually succeeded
    if (eventArgs.Error == null && !eventArgs.Cancelled)
    {
        byte[] fileData = eventArgs.Result;
        using (FileStream fileStream = new FileStream("C:\\Users\\Alex\\Desktop\\Data.rar", FileMode.Create))
            fileStream.Write(fileData, 0, fileData.Length);
    }
    client.Dispose(); // dispose once the operation has finished, not before
};
client.DownloadDataAsync(address);
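Another way, sketched under the assumption that the synchronous API is acceptable: download to a scratch file first and copy it over the real target only after the transfer succeeds, so the target is never left as a 0-byte file:
string tempPath = Path.GetTempFileName();
try
{
    using (WebClient wc = new WebClient())
        wc.DownloadFile("http://path/file", tempPath);
    // The target is only touched after a successful download
    File.Copy(tempPath, "localpath/file", true);
}
finally
{
    File.Delete(tempPath); // clean up the scratch file either way
}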

Unzipping a .gz file using C#

I have a gzipped tar file called ZippedXmls.tar.gz which has 2 XMLs inside it.
I need to programmatically unzip this file, so that the output is the 2 XMLs copied into a folder.
How do I achieve this using C#?
I've used .NET's built-in GZipStream for gzipping byte streams and it works just fine. I suspect that your files are tarred first, before being gzipped.
You've asked for code, so here's a sample, assuming you have a single gzipped file and its compressed bytes in a byte[] called bytes:
using (var inputStream = new MemoryStream(bytes))                        // the compressed data
using (var uncompressed = new GZipStream(inputStream, CompressionMode.Decompress))
using (var outputStream = new FileStream("output.xml", FileMode.Create)) // this is the output
{
    uncompressed.CopyTo(outputStream); // decompress all bytes into the output file
}
Edit:
You've changed your question so that the file is a tar.gz file - technically my answer is not applicable to your situation, but I'll leave it here for folks who want to handle plain .gz files.
SharpZipLib should be able to do this.
I know this question is ancient, but search engines redirect here for how to extract gzip in C#, so I thought I'd provide a slightly more recent example:
using (var inputFileStream = new FileStream("c:\\myfile.xml.gz", FileMode.Open))
using (var gzipStream = new GZipStream(inputFileStream, CompressionMode.Decompress))
using (var outputFileStream = new FileStream("c:\\myfile.xml", FileMode.Create))
{
    await gzipStream.CopyToAsync(outputFileStream);
}
For what should be the simpler question of how to untar see: Decompress tar files using C#
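Putting the two steps together for the original ZippedXmls.tar.gz, a sketch that strips only the gzip layer (the resulting .tar still needs a tar library such as SharpZipLib, as noted above):
using (var gzFile = new FileStream("ZippedXmls.tar.gz", FileMode.Open))
using (var gzip = new GZipStream(gzFile, CompressionMode.Decompress))
using (var tarFile = new FileStream("ZippedXmls.tar", FileMode.Create))
{
    // This produces a plain tar archive; extracting the 2 XMLs from it
    // is the untarring step covered by the link above.
    gzip.CopyTo(tarFile);
}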
