Errors converting form file to memory stream - C#

I am trying to upload a file and save it into Azure Blob storage.
The file is injected as an IFormFile.
The problem is that errors appear when I convert the FormFile to a MemoryStream, and the stream then uploads to Azure with no data in it.
public async Task<IActionResult> Create([Bind("EndorsementId,FileName,ProviderId,Title")] Endorsement endorsement, IFormFile formFile)
{
    if (ModelState.IsValid)
    {
        ...
        var data = new MemoryStream();
        formFile.CopyTo(data);
        var buf = new byte[data.Length];
        data.Read(buf, 0, buf.Length);
        UploadToAzure(data);
        ...
The errors are on the ReadTimeout and WriteTimeout properties of the memory stream: 'data.ReadTimeout' threw an exception of type 'System.InvalidOperationException' and 'data.WriteTimeout' threw an exception of type 'System.InvalidOperationException', respectively.
Here is how I injected the FormFile. There seems to be very little information on this.
http://www.mikesdotnetting.com/article/288/uploading-files-with-asp-net-core-1-0-mvc
Thanks in advance.

IFormFile has a CopyToAsync method for this purpose. You can just do something like this:
using (var outputStream = await blobReference.OpenWriteAsync())
{
    await formFile.CopyToAsync(outputStream, cancellationToken);
}
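For context, here is a sketch of how blobReference might be obtained and used inside the original Create action, assuming the older WindowsAzure.Storage client (the container name here is a hypothetical placeholder):

CloudBlobContainer container = blobClient.GetContainerReference("endorsements"); // hypothetical container name
CloudBlockBlob blobReference = container.GetBlockBlobReference(formFile.FileName);

using (var outputStream = await blobReference.OpenWriteAsync())
{
    // Streams the request body straight to blob storage:
    // no intermediate MemoryStream, and no position to reset.
    await formFile.CopyToAsync(outputStream, cancellationToken);
}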

The position of the MemoryStream is still at the end of the stream after you fill in the data. You can either reset the position:
var data = new MemoryStream();
formFile.CopyTo(data);
// At this point, the Position is at the end of the MemoryStream,
// so seek back to the beginning:
data.Seek(0, SeekOrigin.Begin);
var buf = new byte[data.Length];
data.Read(buf, 0, buf.Length);
// Note: Read() advances Position back to the end, so if UploadToAzure
// reads from the stream, seek to the beginning again first.
UploadToAzure(data);
Or, rather than doing all of the work yourself, you can have MemoryStream copy the data out to a byte[] array for you by doing this after the CopyTo() call:
// Or, save yourself some work and just do this
// to make MemoryStream do the work for you
UploadToAzure(data.ToArray());
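Note that ToArray() copies the stream's contents regardless of the current Position, which is why no Seek call is needed in this variant.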

You can also upload the content of an IFormFile to Azure Blob Storage like this:
using (var stream = formFile.OpenReadStream())
{
    var blobServiceClient = new BlobServiceClient(azureBlobConnectionString);
    var containerClient = blobServiceClient.GetBlobContainerClient("containerName");
    var blobClient = containerClient.GetBlobClient("filename");
    await blobClient.UploadAsync(stream);
}
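One note on that last call: UploadAsync(stream) throws if the blob already exists; the Azure.Storage.Blobs client also offers an UploadAsync(stream, overwrite: true) overload for that case.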

Related

VirusTotal Uploaded File is Zero Bytes

I'm trying to upload a file to VirusTotal using .NET Core, but the uploaded file size is zero bytes. Why does this happen?
[Route("api/[controller]")]
public class ScannerController : Controller
{
    [HttpGet]
    public async Task<VirusTotalNet.Results.FileReport> ScanAsync(string file_id)
    {
        file_id = "./wwwroot/Upload/node-v12.14.1-x64.msi";
        VirusTotal virusTotal = new VirusTotal("");
        // virusTotal.UseTLS = true;
        FileStream stream = System.IO.File.OpenRead(file_id);
        byte[] fileBytes = new byte[stream.Length];
        stream.Read(fileBytes, 0, fileBytes.Length);
        VirusTotalNet.Results.FileReport report = await virusTotal.GetFileReportAsync(stream);
        return report;
    }
}
You've read the entire file into a byte[] and there's an overload of GetFileReportAsync that will take that, so change the parameter from stream to fileBytes:
VirusTotalNet.Results.FileReport report = await virusTotal.GetFileReportAsync(fileBytes);
Derviş Kayımbaşıoğlu suggested resetting the stream's position, but the location mentioned was incorrect. Either of these:
stream.Seek(0L, SeekOrigin.Begin);
// or
stream.Position = 0L;
needed to be done immediately before calling GetFileReportAsync, i.e. after the file had been read, not before. That would have worked.
But wait, there's more!
There's no need to read the file into fileBytes, which means there's no need to reset the position. The stream can be opened and passed directly to GetFileReportAsync. Including proper resource disposal, the entire method becomes this:
[HttpGet]
public async Task<VirusTotalNet.Results.FileReport> ScanAsync(string file_id)
{
    file_id = "./wwwroot/Upload/node-v12.14.1-x64.msi";
    VirusTotal virusTotal = new VirusTotal("");
    // virusTotal.UseTLS = true;
    using (FileStream stream = System.IO.File.OpenRead(file_id))
    {
        VirusTotalNet.Results.FileReport report = await virusTotal.GetFileReportAsync(stream);
        return report;
    }
}
This allows both the file to be read and the socket to be written asynchronously, and the data can be buffered in small amounts so that large files don't have to be loaded entirely into memory.
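One caveat, assuming the VirusTotalNet library behaves as I recall: GetFileReportAsync only hashes the content and looks up an existing report, while actually submitting a new file for scanning is done with ScanFileAsync, which likewise accepts a stream.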

ByteArray to IFormFile

I am developing a REST API with C# and .NET Core.
I have a function in my repository which accepts a parameter of type IFormFile.
public async Task<bool> UploadFile(IFormFile file)
{
    // do some stuff and save the file to azure storage
}
This function is called by a controller method which passes it the uploaded file:
public class FileController : Controller
{
    public async Task<IActionResult> UploadDoc(IFormFile file)
    {
        // Call the repository function to save the file on Azure
        var res = await documentRepository.UploadFile(file);
    }
}
Now I have another function that calls an external API which returns a file as a byte array. I'd like to save this byte array using the repository.UploadFile method but I can't cast the byte array object to IFormFile.
Is it possible?
You can convert the byte array to a MemoryStream:
var stream = new MemoryStream(byteArray);
...and then pass that to the constructor of the FormFile class:
IFormFile file = new FormFile(stream, 0, byteArray.Length, "name", "fileName");
Your repo shouldn't be using IFormFile. That's an abstraction that only applies to one particular method of HTTP file transfer (namely a multipart/form-data encoded request body). Something like your repo should have no knowledge of the source of the file (HTTP), nor how it was transmitted (multipart/form-data vs application/json for example).
Instead, you should use Stream for your param. In your UploadDoc action, then, you can simply do:
using (var stream = file.OpenReadStream())
{
    await documentRepository.UploadFile(stream);
}
And, where you have just a byte array:
using (var stream = new MemoryStream(byteArray))
{
    await documentRepository.UploadFile(stream);
}
You might also consider adding an overload of UploadFile that takes a byte[], as creating a new memory stream from a byte array just to have a stream is a waste of resources. However, a byte[] has to be handled differently than a Stream, so it may require some duplication of logic to go that route. You'll need to evaluate the tradeoffs.
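As a rough sketch of that pair of overloads (the names and the blobClient field are illustrative, not from the original post; note that wrapping an existing array in a MemoryStream does not copy the bytes, so the overhead is only the wrapper object):

// Core implementation works on a Stream.
public async Task<bool> UploadFile(Stream file)
{
    await blobClient.UploadAsync(file); // hypothetical BlobClient field
    return true;
}

// Convenience overload for callers that already hold a byte[].
public async Task<bool> UploadFile(byte[] file)
{
    // A MemoryStream over an existing array wraps it without copying the bytes.
    using (var stream = new MemoryStream(file, writable: false))
    {
        return await UploadFile(stream);
    }
}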
1. Create a new MemoryStream based on the byte array.
2. Create a new FormFile object based on the MemoryStream.
3. Make sure to append the ContentDisposition header, otherwise you will be unable to operate on your FormFile object, as a C# exception will be thrown.
The complete code:
using (var stream = new MemoryStream(byteArray))
{
    var file = new FormFile(stream, 0, byteArray.Length, name, fileName)
    {
        Headers = new HeaderDictionary(),
        ContentType = contentType,
    };

    System.Net.Mime.ContentDisposition cd = new System.Net.Mime.ContentDisposition
    {
        FileName = file.FileName
    };
    file.ContentDisposition = cd.ToString();
}
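Note the order here: Headers must be assigned before ContentType, because FormFile.ContentType reads and writes through the Headers dictionary, and setting it while Headers is still null throws.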

MemoryStream throws exception of type InvalidOperationException

I hope that you can help :)
In my ASP.NET Core 2.2 MVC project, when debugging a simple:
MemoryStream ms = new MemoryStream();
Right after initialization it gives me a:
ReadTimeout: 'ms.ReadTimeout' threw an exception of type 'System.InvalidOperationException'
WriteTimeout: 'ms.WriteTimeout' threw an exception of type 'System.InvalidOperationException'
Now the solution doesn't crash or anything. But if I inspect the "ms" in Visual Studio, that's what it says.
What I'm trying to do is via SixLabors.ImageSharp do:
IFormFile file = viewModel.File.Image;
using (Image<Rgba32> image = Image.Load(file.OpenReadStream()))
using (var ms = new MemoryStream())
{
image.Mutate(x => x.Resize(1000, 1000));
SixLabors.ImageSharp.Formats.Jpeg.JpegEncoder jpegEncoder =
new SixLabors.ImageSharp.Formats.Jpeg.JpegEncoder();
jpegEncoder.Quality = 80;
image.Save(ms, jpegEncoder);
StorageCredentials storageCredentials = new StorageCredentials("Name", "KeyValue");
// Create cloudstorage account by passing the storagecredentials
CloudStorageAccount storageAccount = new CloudStorageAccount(storageCredentials, true);
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Get reference to the blob container by passing the name by reading the value from the configuration (appsettings.json)
CloudBlobContainer container = blobClient.GetContainerReference("storagefolder");
// Get the reference to the block blob from the container
CloudBlockBlob blockBlob = container.GetBlockBlobReference("image.jpg");
await blockBlob.UploadFromStreamAsync(ms);
}
But the saved stream is empty (when debugging there are values in Capacity, Length and Position, but after uploading to Azure Blob storage the size is 0).
Kind regards
Anda Hendriksen
The ReadTimeout and WriteTimeout "exceptions" you see in the debugger are harmless: MemoryStream doesn't support timeouts, so merely inspecting those properties throws InvalidOperationException. They are not why the upload is empty.
The actual issue is that after image.Save, the memory stream's position is at the end of the written data, so copying it to your output stream starts from the end and uploads nothing. Re-position the memory stream to the start.
So, before writing the stream to your output:
ms.Flush(); // harmless here: MemoryStream has no buffer to flush
ms.Position = 0; // or ms.Seek(0, SeekOrigin.Begin);
Then call
await blockBlob.UploadFromStreamAsync(ms);
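In context, the tail of the using block then becomes something like this minimal sketch:

image.Save(ms, jpegEncoder);   // leaves ms.Position at the end of the written data
ms.Position = 0;               // rewind so the upload starts from the first byte
await blockBlob.UploadFromStreamAsync(ms);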

C# Web.API Returning image. MemoryStream -> StreamContent returns broken image

I need to return an image stored on disk from a web service.
In my controller I perform some search operations and send the file. Here is my code:
public HttpResponseMessage Get([FromUri]ShowImageRequest req)
{
    // .......................
    // .......................
    // load image file
    var imgStream = new MemoryStream();
    using (Image image = Image.FromFile(fullImagePath))
    {
        image.Save(imgStream, ImageFormat.Jpeg);
    }
    imgStream.Seek(0, SeekOrigin.Begin); // it does not work without this
    var res = new HttpResponseMessage(HttpStatusCode.OK);
    res.Content = new StreamContent(imgStream);
    res.Content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
    return res;
}
If I do not add this line, I see a response body length of 0 in Fiddler:
imgStream.Seek(0, SeekOrigin.Begin);
Otherwise it works. What am I missing and why do I need to do this?
After saving, the stream's position is at the end. This means that reading from it returns no bytes.
Everyone runs into this exact issue once :)
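An alternative that sidesteps the positioning issue entirely, at the cost of one extra copy of the image bytes, is ByteArrayContent, since MemoryStream.ToArray() returns the full contents regardless of Position (a sketch):

var res = new HttpResponseMessage(HttpStatusCode.OK);
// ToArray() ignores the current Position, so no Seek is required.
res.Content = new ByteArrayContent(imgStream.ToArray());
res.Content.Headers.ContentType = new MediaTypeHeaderValue("image/jpeg");
return res;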

How to get the stream for a Multipart file in webapi upload?

I need to upload a file using a Stream (for Azure Blob storage), and I just cannot find out how to get the stream from the object itself. See the code below.
I'm new to Web API and have used some examples. I'm getting the files and file data, but it's not the correct type for my methods to upload it. Therefore, I need to get or convert it into a normal Stream, which seems a bit hard at the moment :)
I know I need to use ReadAsStreamAsync().Result in some way, but it crashes in the foreach loop since provider.Contents contains two items (the first one seems right, the second one does not).
[System.Web.Http.HttpPost]
public async Task<HttpResponseMessage> Upload()
{
    if (!Request.Content.IsMimeMultipartContent())
    {
        this.Request.CreateResponse(HttpStatusCode.UnsupportedMediaType);
    }

    var provider = GetMultipartProvider();
    var result = await Request.Content.ReadAsMultipartAsync(provider);

    // On upload, files are given a generic name like "BodyPart_26d6abe1-3ae1-416a-9429-b35f15e6e5d5"
    // so this is how you can get the original file name
    var originalFileName = GetDeserializedFileName(result.FileData.First());

    // uploadedFileInfo object will give you some additional stuff like file length,
    // creation time, directory name, a few filesystem methods etc..
    var uploadedFileInfo = new FileInfo(result.FileData.First().LocalFileName);

    // Remove this line as well as GetFormData method if you're not
    // sending any form data with your upload request
    var fileUploadObj = GetFormData<UploadDataModel>(result);

    Stream filestream = null;
    using (Stream stream = new MemoryStream())
    {
        foreach (HttpContent content in provider.Contents)
        {
            BinaryFormatter bFormatter = new BinaryFormatter();
            bFormatter.Serialize(stream, content.ReadAsStreamAsync().Result);
            stream.Position = 0;
            filestream = stream;
        }
    }

    var storage = new StorageServices();
    storage.UploadBlob(filestream, originalFileName);
}

private MultipartFormDataStreamProvider GetMultipartProvider()
{
    var uploadFolder = "~/App_Data/Tmp/FileUploads"; // you could put this in web.config
    var root = HttpContext.Current.Server.MapPath(uploadFolder);
    Directory.CreateDirectory(root);
    return new MultipartFormDataStreamProvider(root);
}
This is identical to a dilemma I had a few months ago (capturing the upload stream before the MultipartStreamProvider took over and auto-magically saved the stream to a file). The recommendation was to inherit that class and override the methods ... but that didn't work in my case. :( (I wanted the functionality of both the MultipartFileStreamProvider and MultipartFormDataStreamProvider rolled into one MultipartStreamProvider, without the autosave part).
This might help; here's one written by one of the Web API developers, and this from the same developer.
Hi, I just wanted to post my answer so that anybody who encounters the same issue can find a solution here.
MultipartMemoryStreamProvider stream = await this.Request.Content.ReadAsMultipartAsync();
foreach (var st in stream.Contents)
{
    var fileBytes = await st.ReadAsByteArrayAsync();
    string base64 = Convert.ToBase64String(fileBytes);
    var contentHeader = st.Headers;
    string filename = contentHeader.ContentDisposition.FileName.Replace("\"", "");
    string filetype = contentHeader.ContentType.MediaType;
}
I used MultipartMemoryStreamProvider and got all the details like filename and filetype from the headers of each content part.
Hope this helps someone.
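Tying this back to the original goal of uploading to Azure Blob storage: each part's stream can also be passed straight to the upload method, avoiding the byte[] buffering. A sketch, reusing the question's StorageServices.UploadBlob(stream, name) signature:

MultipartMemoryStreamProvider provider = await Request.Content.ReadAsMultipartAsync();
var storage = new StorageServices();

foreach (var part in provider.Contents)
{
    string filename = part.Headers.ContentDisposition.FileName.Replace("\"", "");
    using (Stream partStream = await part.ReadAsStreamAsync())
    {
        // Hand each part's stream directly to the uploader.
        storage.UploadBlob(partStream, filename);
    }
}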
