Overwriting local files with remote files causes Unity standalone application to freeze - C#

I am a developer using the Unity game engine, trying to overwrite local files with files from an FTP server. I am using the System.IO.File.WriteAllBytes function to do so.
When I start the application and trigger the updated code, the application freezes.
In my Windows Form, the code arrives here: it uses the WebClient instance to download the file, and overwrites the local copy if the remote file is larger:
public void downloadFile(WebClient webClient, string urlAddress,
    string location, byte[] localFile)
{
    webClient.Proxy = null;
    webClient.Credentials = new NetworkCredential("<user>", "<pass>");
    byte[] fileData = webClient.DownloadData("ftp://" + urlAddress);

    /*
     * Only overwrite if the remote file
     * is larger than the local file.
     */
    if (fileData.Length > localFile.Length)
    {
        File.WriteAllBytes(location, fileData);
    }
}
Using a FileStream causes the application to freeze just as well:
FileStream _FileStream = new FileStream(location, FileMode.Create, FileAccess.Write);
_FileStream.Write(fileData, 0, fileData.Length);
_FileStream.Close();
I also tried writing all the files that needed updating to a temporary folder and then using File.Copy.
What am I supposed to be doing to properly overwrite files?
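For context: WebClient.DownloadData and File.WriteAllBytes are both synchronous calls, and Unity runs scripts on the main thread, so a blocking FTP download will stall the whole frame loop until it completes. Below is a minimal sketch of moving the blocking work onto the thread pool, assuming a .NET 4.x scripting runtime; it mirrors the question's method but is not the poster's code:

using System.IO;
using System.Net;
using System.Threading.Tasks;

public async Task DownloadFileAsync(string urlAddress, string location, byte[] localFile)
{
    // Run the blocking FTP download on a thread-pool thread so the
    // main thread keeps rendering.
    byte[] fileData = await Task.Run(() =>
    {
        using (var webClient = new WebClient())
        {
            webClient.Proxy = null;
            webClient.Credentials = new NetworkCredential("<user>", "<pass>");
            return webClient.DownloadData("ftp://" + urlAddress);
        }
    });

    // Same size check as the question; the write is offloaded too.
    if (fileData.Length > localFile.Length)
    {
        await Task.Run(() => File.WriteAllBytes(location, fileData));
    }
}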

Related

File in use error after copying with WebClient to FTP

I download a file from a local FTP server with this code:
System.Net.WebClient oClientFTP = new System.Net.WebClient();
oClientFTP.Credentials = new System.Net.NetworkCredential("user", "password");
oClientFTP.DownloadFile("ftp://192.168.0.10/files/test.pdf","test.pdf");
oClientFTP.Dispose();
The file is copied correctly but is not released; anything I try to do tells me that the file is in use by another application. I tried using Process Explorer, but it didn't solve the problem.
I also tried to copy the file to another file but the problem is the same.
How can I free the file after copying?
I solved it by using a MemoryStream, which I then write to a file:
using (MemoryStream stream = new MemoryStream(oClientFTP.DownloadData(cFtp + cNomefile)))
{
    using (FileStream outputFileStream = new FileStream(cNomefile, FileMode.Create))
    {
        stream.CopyTo(outputFileStream);
    }
}
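For completeness, here is a self-contained sketch of the same pattern using the question's URL and credentials; wrapping the WebClient and the FileStream in using blocks guarantees every handle is released the moment the copy finishes:

using System.IO;
using System.Net;

class FtpDownloadExample
{
    static void Main()
    {
        using (var oClientFTP = new WebClient())
        {
            oClientFTP.Credentials = new NetworkCredential("user", "password");
            byte[] data = oClientFTP.DownloadData("ftp://192.168.0.10/files/test.pdf");

            // The FileStream is closed when the using block ends, so the
            // file is never left "in use by another application".
            using (var outputFileStream = new FileStream("test.pdf", FileMode.Create))
            {
                outputFileStream.Write(data, 0, data.Length);
            }
        }
    }
}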

Read file from Azure into MemoryStream C#

I have an ini config file located on Azure. I don't want to download this file, which is how it's currently being handled. I want to read it into a MemoryStream, parse it from there, and then have the MemoryStream automatically flush the data.
Is there any way to do this without having to download the file itself onto the local drive?
Current download method is:
myWebClient.DownloadFile("AzureLink", @"C:\Program Files (x86)\MyProgram\downloadedFile.ini")
I assume this is what you're looking for:
using (WebClient wc = new WebClient())
using (MemoryStream stream = new MemoryStream(wc.DownloadData(url)))
{
    // your code in here
}
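And a hedged sketch of the parsing step, assuming the ini file is plain UTF-8 text with simple key=value lines (the actual parsing rules are up to you); everything is disposed when the using blocks end, so nothing lands on the local drive:

using System;
using System.IO;
using System.Net;

static void ReadIniFromUrl(string url)
{
    using (var wc = new WebClient())
    using (var stream = new MemoryStream(wc.DownloadData(url)))
    using (var reader = new StreamReader(stream))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // Skip blank lines and ; comments, then split key=value pairs.
            if (line.Trim().Length == 0 || line.TrimStart().StartsWith(";"))
                continue;
            int eq = line.IndexOf('=');
            if (eq > 0)
                Console.WriteLine(line.Substring(0, eq).Trim() + " = " + line.Substring(eq + 1).Trim());
        }
    }
}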

Unzip a LARGE zip file in Azure File Storage w/o "Out of Memory" exception

Here's what I'm dealing with...
Some process (out of our control) will occasionally drop a zip file into a directory in Azure File Storage. That directory name is InBound. So let's say a file called bigbook.zip is dropped into the InBound folder.
I need to create an Azure Function App that runs every 5 minutes and looks for zip files in the InBound directory. If any exist, then one by one, we create a new directory by the same name as the zip file in another directory (called InProcess). So in our example, I would create InProcess/bigbook.
Now inside InProcess/bigbook, I need to unzip bigbook.zip. So by the time the process is done running InProcess/bigbook will contain all the contents of bigbook.zip.
Please note: This function I am creating is a Console App that will run as an Azure Function App. So there will be no file system access (at least, as far as I'm aware, anyway.) There is no option to download the zip file, unzip it, and then move the contents.
I am having a devil of a time figuring out how to do this in memory only. No matter what I try, I keep running into an Out Of Memory exception. For now, I am just doing this on my localhost, running in debug in Visual Studio 2017, .NET 4.7. In that setting, I am not able to process the test zip file, which is 515,069 KB.
This was my first attempt:
private async Task<MemoryStream> GetMemoryStreamAsync(CloudFile inBoundfile)
{
    MemoryStream memstream = new MemoryStream();
    await inBoundfile.DownloadToStreamAsync(memstream).ConfigureAwait(false);
    return memstream;
}
And this (with high hopes) was my second attempt, thinking that DownloadRangeToStream would work better than just DownloadToStream.
private MemoryStream GetMemoryStreamByRange(CloudFile inBoundfile)
{
    MemoryStream outPutStream = new MemoryStream();
    inBoundfile.FetchAttributes();
    int bufferLength = 1 * 1024 * 1024; // 1 MB chunk
    long blobRemainingLength = inBoundfile.Properties.Length;
    long offset = 0;
    while (blobRemainingLength > 0)
    {
        long chunkLength = (long)Math.Min(bufferLength, blobRemainingLength);
        using (var ms = new MemoryStream())
        {
            inBoundfile.DownloadRangeToStream(ms, offset, chunkLength);
            lock (outPutStream)
            {
                outPutStream.Position = offset;
                var bytes = ms.ToArray();
                outPutStream.Write(bytes, 0, bytes.Length);
            }
        }
        offset += chunkLength;
        blobRemainingLength -= chunkLength;
    }
    return outPutStream;
}
But either way, I am running into memory issues. I presume it's because the MemoryStream I am trying to create gets too large?
How else can I tackle this? And again, downloading the zip file is not an option, as the app will ultimately be an Azure Function App. I'm also pretty sure that using a FileStream isn't an option either, as that requires a local file path, which I don't have. (I only have a remote Azure URL)
Could I somehow create a temp file in the same Azure Storage account that the zip file is in, and stream the zip file to that temp file instead of to a memory stream? (Thinking out loud.)
The goal is to get the stream into a ZipArchive using:
ZipArchive archive = new ZipArchive(stream)
And from there I can extract all the contents. But getting to that point w/o memory errors is proving a real bugger.
Any ideas?
Using an Azure Storage File Share, this is the only way it worked for me without loading the entire ZIP into memory. I tested with a 3 GB ZIP file (containing thousands of files, or one big file) and memory/CPU stayed low and stable. I hope it helps!
var zipFiles = _directory.ListFilesAndDirectories()
    .OfType<CloudFile>()
    .Where(x => x.Name.ToLower().Contains(".zip"))
    .ToList();

foreach (var zipFile in zipFiles)
{
    using (var zipArchive = new ZipArchive(zipFile.OpenRead()))
    {
        foreach (var entry in zipArchive.Entries)
        {
            if (entry.Length > 0)
            {
                CloudFile extractedFile = _directory.GetFileReference(entry.Name);
                using (var entryStream = entry.Open())
                {
                    byte[] buffer = new byte[16 * 1024];
                    using (var ms = extractedFile.OpenWrite(entry.Length))
                    {
                        int read;
                        while ((read = entryStream.Read(buffer, 0, buffer.Length)) > 0)
                        {
                            ms.Write(buffer, 0, read);
                        }
                    }
                }
            }
        }
    }
}
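A note on why this stays flat on memory (my reading, not the answerer's): CloudFile.OpenRead returns a seekable stream over the remote file, so ZipArchive can seek to each entry's compressed data on demand, and the 16 KB buffer means only one small chunk of any entry is ever held in memory while it is copied back out to the share.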
I would suggest you use memory snapshots to see why you are running out of memory within Visual Studio. You can use the tutorial in this article to find the culprit. Doing local development with a smaller file may help you continue to work if your machine is simply running out of memory.
When it comes to doing this within Azure, a node in the Consumption plan is limited to 1.5GB of total memory. If you expect to receive files larger than that then you should look at one of the other App Service plans that give you more memory to work with.
It is possible to store files within the function's local directory, so that is an option. You can't guarantee that you will be using the same local directory between executions, but this should work as long as you use the downloaded file within the same execution.
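To make that last suggestion concrete, here is a hedged sketch (not from the answer) that streams the zip into the function's temp directory and opens it from disk, so no MemoryStream ever holds the whole archive. It assumes it runs inside an async method with System.IO.Compression referenced; the temp path and file name are illustrative:

// Stream the remote zip to local disk; only a copy buffer is in memory.
string tempZipPath = Path.Combine(Path.GetTempPath(), "bigbook.zip");
using (var fileStream = File.Create(tempZipPath))
{
    await inBoundfile.DownloadToStreamAsync(fileStream);
}

// ZipFile.OpenRead works from a file path, so the archive is read from disk.
using (ZipArchive archive = ZipFile.OpenRead(tempZipPath))
{
    foreach (ZipArchiveEntry entry in archive.Entries)
    {
        // Extract each entry via entry.Open(), streaming as in the answer above.
    }
}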

Using WebClient to download images from a deployed website

I deployed a website on IIS running at localhost/xxx/xxx.aspx. On my WPF side, I download a text file using WebClient from the localhost server and save it in my WPF app folder.
This is how I do it:
protected void DownloadData(string strFileUrlToDownload)
{
    WebClient client = new WebClient();
    byte[] myDataBuffer = client.DownloadData(strFileUrlToDownload);

    MemoryStream storeStream = new MemoryStream();
    storeStream.SetLength(myDataBuffer.Length);
    storeStream.Write(myDataBuffer, 0, (int)storeStream.Length);
    storeStream.Flush();

    string currentpath = System.IO.Directory.GetCurrentDirectory() + @"\Folder";
    using (FileStream file = new FileStream(currentpath, FileMode.Create, System.IO.FileAccess.ReadWrite))
    {
        byte[] bytes = new byte[storeStream.Length];
        storeStream.Read(bytes, 0, (int)storeStream.Length);
        file.Write(myDataBuffer, 0, (int)storeStream.Length);
        storeStream.Close();
    }

    // The GetString method below gets the data in raw format to manipulate as required.
    string download = Encoding.ASCII.GetString(myDataBuffer);
}
This works by writing bytes and saving them. But how do I download multiple image files and save them in my WPF app folder? I have a URL like localhost/websitename/folder/designs/, and under this URL there are many images. How do I download all of them and save them in the WPF app folder?
Basically, I want to download the contents of the folder, where the contents are actually images.
First, the WebClient class already has a method for this. Use something like client.DownloadFile(remoteUrl, localFilePath).
See this link:
WebClient.DownloadFile Method (String, String)
Secondly, you will need to index the files you want to download on the server somehow. You can't just get a directory listing over HTTP and then loop through it. The web server will need to be configured to enable directory listing, or you will need a page to generate a directory listing. Then you will need to download the results of that page as a string using WebClient.DownloadString and parse it. A simple solution would be an aspx page that outputs a plaintext list of files in the directory you want to download.
Finally, in the code you posted you're saving every single file you download as a file named "Folder". You need to generate a unique filename for each file you want to download. When you're looping through the files you want to download, use something like:
string localFilePath = Path.Combine("MyDownloadFolder", imageName);
where imageName is a unique filename (with file extension) for that file.
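Putting the answer's pieces together, a hedged sketch; list.aspx is hypothetical, a page you would add that prints one image file name per line for the designs folder:

using (var client = new WebClient())
{
    // Hypothetical listing page returning one file name per line.
    string listing = client.DownloadString("http://localhost/websitename/folder/designs/list.aspx");

    Directory.CreateDirectory("MyDownloadFolder");
    foreach (string imageName in listing.Split(
        new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries))
    {
        string localFilePath = Path.Combine("MyDownloadFolder", imageName);
        client.DownloadFile("http://localhost/websitename/folder/designs/" + imageName, localFilePath);
    }
}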

Byte array written to isolated storage area file in C# Windows Phone 7 app is invalid

I have a C# Windows Phone 7.1 app that downloads a PDF file from a foreign web server and then (tries to) save it to the isolated storage area as a file. I have tried several different ways to get this done, but the file always ends up about 30% too large, and when I open it in a text editor, instead of seeing the usual 'PDF' characters at the start of the file followed by the encoded content, I see basically junk. The test file I'm using is supposed to be 161 KB, but when I view it with the Isolated Storage Explorer, it's 271 KB.
First I download the file to a string. I inspected the string at this point in the debugger: it contains the proper values and is the correct length. The trouble happens when I try to write it to the isolated storage area. I tried both StreamWriter and BinaryWriter with identical invalid results. The contents of the resulting file appear to be a long stream of junk characters. Note, I am deleting the file first if it exists, just in case, before writing out the contents. Below is my code using the BinaryWriter version. What is wrong?
async public static Task URLToFileAsync(
    string strUrl,
    string strDestFilename,
    IProgress<int> progress,
    CancellationToken cancelToken)
{
    strUrl = strUrl.Trim();
    if (String.IsNullOrWhiteSpace(strUrl))
        throw new ArgumentException("(Misc::URLToFileAsync) The URL is empty.");

    strDestFilename = strDestFilename.Trim();
    if (String.IsNullOrWhiteSpace(strDestFilename))
        throw new ArgumentException("(Misc::URLToFileAsync) The destination file name is empty.");

    // Create the isolated storage file.
    // FileStream fs = Misc.CreateIsolatedStorageFileStream(strDestFilename);
    IsolatedStorageFile isoStorage = IsolatedStorageFile.GetUserStoreForApplication();

    // Delete the file first.
    if (isoStorage.FileExists(strDestFilename))
        isoStorage.DeleteFile(strDestFilename);

    IsolatedStorageFileStream theIsoStream = isoStorage.OpenFile(strDestFilename, FileMode.Create);
    FileStream fs = theIsoStream;

    // If the stream writer is NULL, then the file could not be created.
    if (fs == null)
        throw new System.IO.IOException("(Misc::URLToFileAsync) Error creating or writing to the file named: " + strDestFilename);

    BinaryWriter bw = new BinaryWriter(fs);
    try
    {
        // Call URLToStringAsync() to get the web file as a string first.
        string strFileContents = await URLToStringAsync(strUrl, progress, cancelToken);
        // >>>> NOTE: strFileContents looks correct and is the correct size.

        // Operation cancelled?
        if (!safeCancellationCheck(cancelToken))
        {
            // Note: BinaryWriter does not have an async method, so we take
            // the hit here and do a synchronous operation.
            // See this Stack Overflow post:
            // http://stackoverflow.com/questions/10315316/asynchronous-binaryreader-and-binarywriter-in-net
            // >>>> NOTE: strFileContents.ToCharArray() looks correct and is the correct length.
            bw.Write(strFileContents.ToCharArray(), 0, strFileContents.Length);
        } // if (safeCancellationCheck(cancelToken))
    }
    finally
    {
        // Make sure the file is cleaned up.
        bw.Flush();
        bw.Close();
        // Make sure the file is disposed.
        bw.Dispose();
    } // try/finally
    // >>>> NOTE: output file in Isolated Storage Explorer is the wrong size and contains apparently junk.
} // async public static void URLToFileAsync
You cannot download a binary file into a string. The result will not be correct, as you have found out.
See this answer, which demonstrates how to download a binary file to isolated storage: https://stackoverflow.com/a/6909201/1822514
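The linked answer boils down to keeping the payload as bytes end to end. A minimal sketch, assuming a hypothetical URLToBytesAsync counterpart to the poster's URLToStringAsync that returns the raw response bytes:

// URLToBytesAsync is hypothetical: like URLToStringAsync, but it must
// return the raw byte[] of the response, never decode it to text.
byte[] fileBytes = await URLToBytesAsync(strUrl, progress, cancelToken);

using (IsolatedStorageFile isoStorage = IsolatedStorageFile.GetUserStoreForApplication())
using (IsolatedStorageFileStream fs = isoStorage.OpenFile(strDestFilename, FileMode.Create))
{
    // Writing bytes directly avoids the text decode/encode round trip
    // that corrupted the PDF and inflated its size.
    fs.Write(fileBytes, 0, fileBytes.Length);
}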
