I'm working on a simple Blazor application that receives a file upload and stores it. I am using BlazorInputFile, and I can't work out why copying the stream to a MemoryStream is causing the browser to freeze.
The details of how to use BlazorInputFile (and how it's implemented) are explained in this blog post: Uploading Files in Blazor.
var ms = new MemoryStream();
await file.Data.CopyToAsync(ms); // With a 1MB file, this line took 3 seconds, and froze the browser
status = $"Finished loading {file.Size} bytes from {file.Name}";
Sample project/repo: https://github.com/paulallington/BlazorInputFileIssue
(this is just the default Blazor app, with BlazorInputFile implemented as per the article)
Use await Task.Delay(1); as mentioned in Zhi Lv's comment on this post: blazor-webassembly upload file can't show progress? The one-millisecond delay yields control back to the renderer, so the UI can repaint before the blocking read starts:
var buffer = new byte[imageFile.Size];
await Task.Delay(1); // yield to the renderer before the heavy read
await imageFile.OpenReadStream(Int64.MaxValue).ReadAsync(buffer);
pratica.Files.Add(new FilePraticaRequest()
{
    Contenuto = buffer,
    Nome = imageFile.Name,
});
StateHasChanged();
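Building on that workaround, here is a hedged sketch that reads the stream in smaller chunks, yielding between reads so the UI can repaint and report progress. The 64 KB chunk size and the progress field are illustrative assumptions, not from the original answer:

var buffer = new byte[imageFile.Size];
var totalRead = 0;
using var stream = imageFile.OpenReadStream(Int64.MaxValue);
while (totalRead < buffer.Length)
{
    var read = await stream.ReadAsync(buffer, totalRead,
        Math.Min(64 * 1024, buffer.Length - totalRead));
    if (read == 0) break;
    totalRead += read;
    progress = (double)totalRead / buffer.Length; // assumed field backing a progress bar
    StateHasChanged();
    await Task.Delay(1); // yield so the renderer can update between chunks
}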
I've experienced the same issue. I've tried predefined components such as Steve Sanderson's file upload and the MatBlazor file upload, as well as my own component for handling file uploads. Small files are not a problem, but once the files get a bit larger the UI hangs, with a MemoryOutOfBoundsException (or similar). So no, async/await can't help you release the UI.
I have put a lot of effort into this issue. One solution, which I am currently using, is to do all file uploads with JavaScript instead of Blazor: just use JavaScript to get the file and post it up to the server. No JSInterop.
However, it seems to be a memory issue in WebAssembly Mono.
Read more here: https://github.com/dotnet/aspnetcore/issues/15777
Note: I haven't tried this on the latest Blazor version, so I'm not sure whether it's fixed or not.
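A minimal sketch of that JavaScript-only approach (the input element id and the /api/upload endpoint are illustrative, not from the original post):

async function uploadFile(inputElementId) {
    // Grab the file straight from the <input type="file"> element
    // and POST it to the server, bypassing Blazor entirely.
    const input = document.getElementById(inputElementId);
    const formData = new FormData();
    formData.append('file', input.files[0]);
    await fetch('/api/upload', { method: 'POST', body: formData });
}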
You wait for the copy result, so the app freezes. You can refactor your code like this:
var ms = new MemoryStream();
file.Data.CopyToAsync(ms).ContinueWith(async task =>
{
    if (task.Exception != null)
    {
        throw task.Exception; // Update this as you see fit
    }
    status = $"Finished loading {file.Size} bytes from {file.Name}";
    await InvokeAsync(StateHasChanged).ConfigureAwait(false); // inform the component that the status changed
}); // With a 1MB file the copy still takes time, but it should not freeze the browser
Currently, I download byte arrays as files using JsInterop.
Here is my JS file:
function downloadFile(fileName, base64String) {
const url = "data:application/octet-stream;base64," + base64String;
const anchorElement = document.createElement('a');
anchorElement.href = url;
anchorElement.download = fileName ?? '';
anchorElement.click();
anchorElement.remove();
}
And here is a method in my razor component:
async Task DownloadFile(byte[] file)
{
ShowMessage("Start");
await JSRuntime.InvokeVoidAsync("downloadFile", "FileName", Convert.ToBase64String(file));
ShowMessage("End");
}
This code works, and I am able to download files. My issue is that I cannot implement a progress bar, or even show a loading spinner, because await JSRuntime.InvokeVoidAsync knows nothing about the actual file download size or its progress. It only launches the download and immediately continues to the next line of code.
In the code above, ShowMessage("Start") and ShowMessage("End") are both shown one after the other as soon as I click the download button, but the file in the browser downloads much later (depending on the file size).
How can I await the download process and execute the relevant code only when the file has been downloaded? It would be even better if I could track the downloaded bytes and show a progress bar with percentages.
Update: for test purposes, I upload the file from the browser and store it in a byte[] variable, then download the same file from that variable using JS. Even though the file is stored in memory, it still takes time to download. I would assume that a file held in memory is already on my PC (the client) and should download immediately; instead, my window freezes for the duration of the download. Tested with 6, 11, and 20 MB files: the bigger the file, the longer I have to wait for it to download.
I suggest you show the ShowMessage("Start") and ShowMessage("End") messages from inside the downloadFile function in JavaScript, so that they bracket the actual download work.
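A minimal sketch of that idea, assuming you pass a DotNetObjectReference to the JS function and have it call dotNetHelper.invokeMethodAsync('OnDownloadStarted') before clicking the anchor and dotNetHelper.invokeMethodAsync('OnDownloadFinished') after it. The function and method names here are illustrative:

async Task DownloadFile(byte[] file)
{
    // selfRef lets the JS function call the [JSInvokable] methods below.
    // In real code, keep it in a field and dispose it with the component.
    var selfRef = DotNetObjectReference.Create(this);
    await JSRuntime.InvokeVoidAsync("downloadFileWithCallback",
        "FileName", Convert.ToBase64String(file), selfRef);
}

[JSInvokable]
public void OnDownloadStarted() => ShowMessage("Start");

[JSInvokable]
public void OnDownloadFinished() => ShowMessage("End");

Note that a data-URL download exposes no progress events, so this brackets the anchor click but still cannot give a percentage.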
Does anyone know how to use the C# OneDrive SDK to perform a resumable upload?
When I use IDriveItemRequestBuilder.CreateUploadSession I always get a new session with the NextExpectedRanges reset.
If I use the .UploadUrl and manually send an HTTP POST, I get the correct next ranges back; however, I then have no way to resume the upload session using the SDK. There doesn't seem to be a way in the API to 'OpenUploadSession', or at least none that I can find.
Nor can I find a working example.
I suspect this must be a common use case.
Please note the keyword in the text: resumable.
I was looking for the same thing and just stumbled upon an example in the official docs:
https://learn.microsoft.com/en-us/graph/sdks/large-file-upload?tabs=csharp.
I tried the code and it worked.
In case it helps, here is my sample implementation: https://github.com/xiaomi7732/onedrive-sample-apibrowser-dotnet/blob/6639444d6298492c38f841e411066635760930c2/OneDriveApiBrowser/FormBrowser.cs#L565
The method of resumption depends on how much state you have. The absolute minimum required is UploadSession.UploadUrl (think of it as a unique identifier for the session). If you don't have that URL, you'd need to create a new upload session and start from the beginning; otherwise, if you do have it, you can do something like the following to resume:
var uploadSession = new UploadSession
{
    // The service will fill in the real ranges when we query it below.
    NextExpectedRanges = Enumerable.Empty<string>(),
    UploadUrl = persistedUploadUrl,
};
var maxChunkSize = 320 * 1024; // 320 KB - change this to your chunk size. 5 MB is the default.
// "ms" is the stream containing the content to upload.
var provider = new ChunkedUploadProvider(uploadSession, graphClient, ms, maxChunkSize);
// This will query the service and make sure the remaining ranges are accurate.
uploadSession = await provider.UpdateSessionStatusAsync();
// Since the remaining ranges is now accurate, this will return the requests required to
// complete the upload.
var chunkRequests = provider.GetUploadChunkRequests();
...
If you have more state, you'd be able to skip some of the above. For example, if you already have a ChunkedUploadProvider but don't know whether it's accurate (maybe it was serialized to disk or something), then you can just start the process with the call to UpdateSessionStatusAsync.
FYI, you can see the code for ChunkedUploadProvider here in case that'll be helpful to see what's going on under the covers.
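For completeness, a rough sketch of draining those chunk requests, based on the classic ChunkedUploadProvider example; the exact GetChunkRequestResponseAsync signature varies between SDK versions, so treat this as an outline rather than a definitive implementation:

var readBuffer = new byte[maxChunkSize];
var trackedExceptions = new List<Exception>();
DriveItem uploadedItem = null;

foreach (var request in chunkRequests)
{
    // Sends one chunk and collects any transient exceptions for retry handling.
    var result = await provider.GetChunkRequestResponseAsync(request, readBuffer, trackedExceptions);
    if (result.UploadSucceeded)
    {
        uploadedItem = result.ItemResponse; // the completed DriveItem
    }
}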
I am trying to upload files in my ASP.NET Core MVC app and it works; however, sometimes the uploaded file is corrupted or 0 bytes in size. Has anyone experienced something like this?
The initial file being uploaded is not large at all.
Thanks.
public IActionResult UploadCertificateFile(IFormFile UploadCertificate)
{
var parsedContentDisposition =
ContentDispositionHeaderValue.Parse(UploadCertificate.ContentDisposition);
var filename = Path.Combine(_hostingEnvironment.WebRootPath,
"Uploads", parsedContentDisposition.FileName.Trim('"'));
using (var stream = System.IO.File.OpenWrite(filename))
{
UploadCertificate.CopyToAsync(stream);
}
return RedirectToAction("Index");
}
I think the problem here is using CopyToAsync without awaiting it. You should make the method async Task<IActionResult> and await the UploadCertificate.CopyToAsync(stream) call, or just use CopyTo (but for I/O it is better to make it async).
The file may not have been copied fully when your request completes, which is why the file ends up corrupted.
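For illustration, here is the question's action with the copy awaited; the only substantive changes are the async signature and the await:

public async Task<IActionResult> UploadCertificateFile(IFormFile UploadCertificate)
{
    var parsedContentDisposition =
        ContentDispositionHeaderValue.Parse(UploadCertificate.ContentDisposition);
    var filename = Path.Combine(_hostingEnvironment.WebRootPath,
        "Uploads", parsedContentDisposition.FileName.Trim('"'));

    using (var stream = System.IO.File.OpenWrite(filename))
    {
        // Awaiting ensures the file is fully written before the redirect is returned.
        await UploadCertificate.CopyToAsync(stream);
    }

    return RedirectToAction("Index");
}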
I'm creating a mockup file upload tool for a community site using Fine Uploader.
I've got the session set up to retrieve the initial files from the server along with a thumbnail url.
It all works great; however, the rendering of the thumbnails is really slow.
I can't work out why, so I hard-coded a very small thumbnail for each of the four files. This made no difference.
The server side is not the issue; the information is coming back very quickly.
Am I doing something wrong? Why is Fine Uploader so slow? Here's a screen grab: it's taking four seconds to render the four thumbnails.
I'm using the latest Chrome. It's a NancyFX project on a fairly powerful machine. Rendering other pages with big images on them is snappy.
Client side code:
thumbnails: {
placeholders: {
waitingPath: '/Content/js/fine-uploader/placeholders/waiting-generic.png',
notAvailablePath: '/Content/js/fine-uploader/placeholders/not_available-generic.png'
}
},
session: {
endpoint: "/getfiles/FlickaId/342"
},
Server side code:
// Fine uploader makes session request to get existing files
Get["/getfiles/FlickaId/{FlickaId}"] = parameters =>
{
//get the image files from the server
var i = FilesDatabase.GetFlickaImagesById(parameters.FlickaId);
// list to hold the files
var list = new List<UploadedFiles>();
// build the response data object list
foreach (var imageFile in i)
{
var f = new UploadedFiles();
f.name = "test-thumb-small.jpg"; // imageFile.ImageFileName;
f.size = 1;
f.uuid = imageFile.FileGuid;
f.thumbnailUrl = "/Content/images/flickabase/thumbnails/" + "test-thumb-small.jpg"; // imageFile.ImageFileName;
list.Add(f);
}
return Response.AsJson(list); // our model is serialised by Nancy as Json!
};
This is by design, and was implemented both to prevent the UI thread from being flooded with the image scaling logic and to prevent a memory leak issue specific to Chrome. This is explained in the thumbnails and previews section of the documentation, specifically in the "performance considerations" area:
For browsers that support client-generated image previews (qq.supportedFeatures.imagePreviews === true), a configurable pause between template-generated previews is in effect. This is to prevent the complex process of generating previews from overwhelming the client machine's CPU for a lengthy amount of time. Without this limit in place, the browser's UI thread runs the risk of blocking, preventing any user interaction (scrolling, etc) until all previews have been generated.
You can adjust or remove this pause via the thumbnails option, but I suggest you not do this unless you are sure users will not drop a large number of complex image files.
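For example, if I remember the option name correctly, the pause between template-generated previews can be tuned via timeBetweenThumbs inside the thumbnails option (the default is documented as 750 ms; verify the name against the thumbnails documentation for your Fine Uploader version):

thumbnails: {
    timeBetweenThumbs: 100, // assumed option; default 750 ms, lower at your own risk
    placeholders: {
        waitingPath: '/Content/js/fine-uploader/placeholders/waiting-generic.png',
        notAvailablePath: '/Content/js/fine-uploader/placeholders/not_available-generic.png'
    }
},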
I'm testing BackgroundTransfer on Windows Phone 8.1 (WinRT) and I'm facing a strange issue when I try to download a file to one of the KnownFolders.
The code is as follows:
string address = @"http://www.onereason.org/archive/wp-content/uploads/2012/02/universe.jpg";
StorageFile tempFile = await KnownFolders.PicturesLibrary.CreateFileAsync("temp.jpg", CreationCollisionOption.ReplaceExisting);
BackgroundDownloader manager = new BackgroundDownloader();
var operation = manager.CreateDownload(new Uri(address), tempFile);
// Progress handler prints the transfer status to the debugger output.
IProgress<DownloadOperation> progressH = new Progress<DownloadOperation>((p) =>
{
    Debug.WriteLine("Transferred: {0}, Total: {1}", p.Progress.BytesReceived, p.Progress.TotalBytesToReceive);
});
await operation.StartAsync().AsTask(progressH);
Debug.WriteLine("BackgroundTransfer created");
It's quite simple and works if I download to ApplicationData.Current.LocalFolder, but if I do it as above, the transfer never completes, even though the progress handler reports that all bytes have been received.
The code never reaches the Debug.WriteLine("BackgroundTransfer created"); line, and if I look at the processes on the phone, I can see that RuntimeBroker is using the CPU at 100%.
Obviously it also keeps working after you finish debugging the app, and the phone gets hotter and hotter. The fastest way out of this situation is to uninstall the app, as this cancels all corresponding background transfers.
All the necessary capabilities are set. I can, for example, download the file to the LocalFolder and then copy it to the KnownFolder, but that is an additional, redundant step. Is there a way to download a file directly to a KnownFolder? Have I missed something?
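For reference, a sketch of the LocalFolder workaround mentioned above, reusing the address and progressH from the snippet earlier; redundant, but it completes reliably:

// Download into the app's LocalFolder first...
StorageFile tempFile = await ApplicationData.Current.LocalFolder
    .CreateFileAsync("temp.jpg", CreationCollisionOption.ReplaceExisting);
BackgroundDownloader manager = new BackgroundDownloader();
DownloadOperation operation = manager.CreateDownload(new Uri(address), tempFile);
await operation.StartAsync().AsTask(progressH);
// ...then copy the finished file into the Pictures library.
await tempFile.CopyAsync(KnownFolders.PicturesLibrary, "temp.jpg",
    NameCollisionOption.ReplaceExisting);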