Download Blob as a file in ASP.NET MVC - c#

I am saving a file to a MySQL database as a blob. Could you please help me: how can I download this blob as a file? I have the blob data and the ContentType in the database. You can see my download method below. I have been searching for over a week but haven't managed to get it working. I'm also not sure whether I can trigger the download directly from this method or whether I need to use AJAX. I highly appreciate your help and assistance. Thanks a lot!
Method:
public HttpPostedFileBase Indir()
{
    using (ISession session = FluentNHibernateHelper.OpenSession())
    {
        var doc = new Document();
        var docDet = new DocumentDetail();
        doc = session.Query<Document>().FirstOrDefault(x => x.Id == 5);
        docDet = session.Query<DocumentDetail>().FirstOrDefault(x => x.DocumentId == doc.Id);
        var test = new MemoryPostedFile(docDet.File, doc.DocumentName, doc.DocumentExtention);
        return test;
    }
}
Class:
public class MemoryPostedFile : HttpPostedFileBase
{
    private readonly byte[] fileBytes;

    public MemoryPostedFile(byte[] fileBytes, string fileName = null, string ContentType = null)
    {
        this.fileBytes = fileBytes;
        this.FileName = fileName;
        this.ContentType = ContentType;
        this.InputStream = new MemoryStream(fileBytes);
    }

    public override int ContentLength => fileBytes.Length;
    public override string FileName { get; }
    public override string ContentType { get; }
    public override Stream InputStream { get; }
}

Thank you so much for your answers! I found the solution and am posting the code below.
public HttpPostedFileBase Indir(string documentId)
{
    using (ISession session = FluentNHibernateHelper.OpenSession())
    {
        var doc = new Document();
        var docDet = new DocumentDetail();
        doc = session.Query<Document>().FirstOrDefault(x => x.Id == Convert.ToInt32(documentId));
        docDet = session.Query<DocumentDetail>().FirstOrDefault(x => x.DocumentId == doc.Id);
        var test = new MemoryPostedFile(docDet.File, doc.DocumentName, doc.DocumentExtention);
        Response.Clear();
        Response.ContentType = doc.DocumentExtention; // "application/octet-stream"
        Response.AddHeader("Content-Disposition", "attachment; filename=" + test.FileName + ";");
        Response.BinaryWrite(docDet.File);
        Server.MapPath("~/" + test.FileName);
        Response.End();
        return test;
    }
}
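For completeness, in ASP.NET MVC the same download can usually be returned without writing to Response manually, by letting the action return a FileContentResult. A minimal sketch using the same entities as above (the action name Download is only illustrative, and it assumes DocumentExtention holds a MIME type, as in the code above):
public ActionResult Download(string documentId)
{
    using (ISession session = FluentNHibernateHelper.OpenSession())
    {
        var doc = session.Query<Document>().FirstOrDefault(x => x.Id == Convert.ToInt32(documentId));
        var docDet = session.Query<DocumentDetail>().FirstOrDefault(x => x.DocumentId == doc.Id);

        // File(bytes, contentType, fileDownloadName) sets the Content-Disposition header,
        // so the browser offers the blob as a download with the given file name.
        return File(docDet.File, doc.DocumentExtention, doc.DocumentName);
    }
}
This avoids the manual Response.Clear()/BinaryWrite()/End() sequence.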

Related

Problems with returning zip file

I am writing an ASP.NET Core Web API with .NET 5.0 as an exercise.
In MyController.cs there is the method DownloadZip(), which should let the client download a zip file. I create a zip file because I have not managed to transfer multiple pictures individually, which is the actual goal. For now, the zip file is also still stored in the pictures folder, which of course should not happen either. I simply still have difficulties with web services and with transferring zip files through them.
Anyway, in the line return File(fullName, "text/plain"); I get the following error message:
System.InvalidOperationException: No file provider has been configured to process the supplied file.
I found several threads on Stack Overflow last Friday about how to transfer a zip file using a memory stream. When I do it this way, the browser shows the individual bytes, but no finished file is downloaded.
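As a side note on that exception: in ASP.NET Core, ControllerBase.File(string, ...) treats the string as a virtual path and resolves it through a configured file provider, which is why it fails for an absolute path on disk. For a physical path there is PhysicalFile; a minimal sketch, assuming fullName holds the absolute path of the zip file:
return PhysicalFile(fullName, "application/zip", System.IO.Path.GetFileName(fullName));
That addresses the exception itself, though it still leaves the zip on disk, which the in-memory approach further down avoids.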
Postings is a List<Posting>, where Posting is:
using System;
using System.Collections.Generic;
namespace ImageRepository
{
public sealed class Posting
{
public DateTime CreationTime { get; set; }
public List<ImageProperties> Imageproperties { get; }
public Posting(DateTime creationTime, List<ImageProperties> imPr)
{
CreationTime = creationTime;
Imageproperties = imPr;
}
}
}
And Imageproperties is the following:
namespace ImageRepository
{
public sealed class ImageProperties
{
public string FullName { get; set; }
public string _Name { get; set; }
public byte[] DataBytes { get; set; }
public ImageProperties(string FullName, string Name, byte[] dataBytes)
{
this.FullName = FullName;
this._Name = Name;
this.DataBytes = dataBytes;
}
}
}
MyController.cs
using Microsoft.AspNetCore.Mvc;
using System.Collections.Generic;
using ImageRepository;
using System.IO.Compression;
namespace WebApp2.Controllers
{
[Route("api/[controller]")]
[ApiController]
public class MyController : ControllerBase
{
private readonly IImageTransferRepository imageRepository;
private readonly System.Globalization.CultureInfo Deu = new System.Globalization.CultureInfo("de-DE");
public MyController(IImageTransferRepository imageTransferRepository)
{
this.imageRepository = imageTransferRepository;
}
//––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––
[HttpGet("WhatAreTheNamesOfTheLatestPictures")] // Route will be https://localhost:44355/api/My/WhatAreTheNamesOfTheLatestPictures/
public ActionResult GetNamesOfNewestPosting()
{
List<string> imageNames = this.imageRepository.GetImageNames();
if (imageNames.Count == 0)
{
return NoContent();
}
return Ok(imageNames);
}
//––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––
[HttpGet("ImagesOfLatestPost")] //route will be https://localhost:44355/api/My/ImagesOfLatestPost
public ActionResult DownloadZip()
{
List<Posting> Postings = this.imageRepository.GetImages();
if (Postings is null || Postings.Count == 0)
{
return NoContent();
}
System.DateTime now = System.DateTime.Now;
string now_as_string = now.ToString("G", Deu).Replace(':', '-');
string folderPath = System.Environment.GetFolderPath(System.Environment.SpecialFolder.MyPictures);
string fullName = $"{folderPath}\\{now_as_string}.zip";
using (ZipArchive newFile = ZipFile.Open(fullName, ZipArchiveMode.Create))
{
for (int i = 0; i < Postings[0].Imageproperties.Count; i++)
{
newFile.CreateEntryFromFile(Postings[0].Imageproperties[i].FullName,
Postings[0].Imageproperties[i]._Name);
}
}
return File(fullName, "text/plain");
}
}
}
Edit June 20, 2022, 4:16 pm
Based on Bagus Tesa's comment, I wrote the following:
byte[] zip_as_ByteArray = System.IO.File.ReadAllBytes(fullName);
return File(zip_as_ByteArray, "application/zip");
The automatic download takes place, but I still have to rename the file by appending .zip so that Windows recognises it as a zip archive.
Furthermore, there is still the problem that I am creating the zip file on the hard disk (using (ZipArchive newFile = ZipFile.Open(fullName, ZipArchiveMode.Create))). How can I change this?
Thanks to the thread linked by Bagus Tesa, I can now answer my own question. I have adapted a few things to my needs (see the for loop), because I have several images.
[HttpGet("ImagesOfLatestPost")] //route will be https://localhost:44355/api/My/ImagesOfLatestPost
public ActionResult DownloadZip()
{
List<Posting> Postings = this.imageRepository.GetImages();
if (Postings is null || Postings.Count == 0)
{
return NoContent();
}
byte[] compressedBytes;
using (var outStream = new System.IO.MemoryStream())
{
using (var archive = new ZipArchive(outStream, ZipArchiveMode.Create, true))
{
for (int i = 0; i < Postings[0].Imageproperties.Count; i++)
{
ZipArchiveEntry fileInArchive = archive.CreateEntry(Postings[0].Imageproperties[i]._Name, CompressionLevel.Optimal);
using System.IO.Stream entryStream = fileInArchive.Open();
using System.IO.MemoryStream fileToCompressStream = new System.IO.MemoryStream(Postings[0].Imageproperties[i].DataBytes);
fileToCompressStream.CopyTo(entryStream);
}
}
compressedBytes = outStream.ToArray();
}
return File(compressedBytes, "application/zip", $"Export_{System.DateTime.Now:yyyyMMddhhmmss}.zip");
}
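If buffering the whole archive into a byte array ever becomes a concern, the MemoryStream can also be handed to File(Stream, contentType, fileDownloadName) directly; ASP.NET Core disposes the stream after writing the response. A sketch of that variant under the same assumptions as the code above (route and method name are only illustrative):
[HttpGet("ImagesOfLatestPostStreamed")] // illustrative route
public ActionResult DownloadZipStreamed()
{
    List<Posting> Postings = this.imageRepository.GetImages();
    if (Postings is null || Postings.Count == 0)
    {
        return NoContent();
    }

    var outStream = new System.IO.MemoryStream();
    using (var archive = new ZipArchive(outStream, ZipArchiveMode.Create, leaveOpen: true))
    {
        foreach (ImageProperties image in Postings[0].Imageproperties)
        {
            ZipArchiveEntry entry = archive.CreateEntry(image._Name, CompressionLevel.Optimal);
            using System.IO.Stream entryStream = entry.Open();
            entryStream.Write(image.DataBytes, 0, image.DataBytes.Length);
        }
    }

    outStream.Position = 0; // rewind before handing the stream to the result
    return File(outStream, "application/zip", $"Export_{System.DateTime.Now:yyyyMMddhhmmss}.zip");
}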

c# Send image from WPF to WebAPI

I have a Web API 2.1 service (ASP.NET MVC 4) that receives an image and related data.
I need to send this image from a WPF application, but I get a 404 Not Found error.
Server side
[HttpPost]
[Route("api/StoreImage")]
public string StoreImage(string id, string tr, string image)
{
// Store image on server...
return "OK";
}
Client side
public bool SendData(decimal id, int time, byte[] image)
{
string url = "http://localhost:12345/api/StoreImage";
var wc = new WebClient();
wc.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
var parameters = new NameValueCollection()
{
{ "id", id.ToString() },
{ "tr", time.ToString() },
{ "image", Convert.ToBase64String(image) }
};
var res=wc.UploadValues(url, "POST", parameters);
return true;
}
The URL exists; I think I need to encode the data as JSON, but I don't know how.
Thanks for your time!
In your case the method parameters are expected to come from the query string (simple types bind from the URI in Web API).
I would suggest you turn the parameter list into one single request object like this:
public class PhotoUploadRequest
{
public string id;
public string tr;
public string image;
}
Then, in your API, convert the Base64 string back into a byte buffer like this:
var buffer = Convert.FromBase64String(request.image);
Then cast it to HttpPostedFileBase
HttpPostedFileBase objFile = (HttpPostedFileBase)new MemoryPostedFile(buffer);
Now you have the image file. Do whatever you want.
Full Code here:
[HttpPost]
[Route("api/StoreImage")]
public string StoreImage(PhotoUploadRequest request)
{
var buffer = Convert.FromBase64String(request.image);
HttpPostedFileBase objFile = (HttpPostedFileBase)new MemoryPostedFile(buffer);
//Do whatever you want with the file name and its binary data.
try
{
if (objFile != null && objFile.ContentLength > 0)
{
string path = "Set your desired path and file name";
objFile.SaveAs(path);
//Don't Forget to save path to DB
}
}
catch (Exception ex)
{
//HANDLE EXCEPTION
}
return "OK";
}
Edit:
I forgot to add the Code for MemoryPostedFile class
public class MemoryPostedFile : HttpPostedFileBase
{
private readonly byte[] fileBytes;
public MemoryPostedFile(byte[] fileBytes, string fileName = null)
{
this.fileBytes = fileBytes;
this.FileName = fileName;
this.InputStream = new MemoryStream(fileBytes);
}
public override void SaveAs(string filename)
{
File.WriteAllBytes(filename, fileBytes);
}
public override string ContentType => base.ContentType;
public override int ContentLength => fileBytes.Length;
public override string FileName { get; }
public override Stream InputStream { get; }
}
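On the client side, the WPF application then needs to send that object as JSON instead of form-encoded values. A minimal sketch using HttpClient and Json.NET (the anonymous object mirrors the PhotoUploadRequest fields above; the URL is taken from the question, everything else is illustrative):
// requires: using System.Net.Http; using System.Text; using System.Threading.Tasks;
public async Task<bool> SendDataAsync(decimal id, int time, byte[] image)
{
    var request = new
    {
        id = id.ToString(),
        tr = time.ToString(),
        image = Convert.ToBase64String(image)
    };

    using (var client = new HttpClient())
    {
        var content = new StringContent(
            Newtonsoft.Json.JsonConvert.SerializeObject(request),
            Encoding.UTF8,
            "application/json");

        HttpResponseMessage response = await client.PostAsync("http://localhost:12345/api/StoreImage", content);
        return response.IsSuccessStatusCode;
    }
}
With the complex parameter on the server action, Web API binds the JSON body to PhotoUploadRequest, so the values no longer need to appear in the query string.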

ObjectDisposedException when trying to upload a file

I have this service class:
public delegate string AsyncMethodCaller(string id, HttpPostedFileBase file);
public class ObjectService : IDisposable
{
private readonly IObjectRepository repository;
private readonly IAmazonS3 client;
private readonly string bucketName;
private static object syncRoot = new object();
private static IDictionary<string, int> processStatus { get; set; }
public ObjectService(string accessKey, string secretKey, string bucketName)
{
var credentials = new BasicAWSCredentials(accessKey, secretKey);
this.bucketName = bucketName;
this.client = new AmazonS3Client(credentials, RegionEndpoint.EUWest1);
this.repository = new ObjectRepository(this.client, this.bucketName);
if (processStatus == null)
processStatus = new Dictionary<string, int>();
}
public IList<S3Object> GetAll()
{
return this.repository.GetAll();
}
public S3Object Get(string key)
{
return this.GetAll().Where(model => model.Key.Equals(key, StringComparison.OrdinalIgnoreCase)).SingleOrDefault();
}
/// <summary>
/// Note: You can upload objects of up to 5 GB in size in a single operation. For objects greater than 5 GB you must use the multipart upload API.
/// Using the multipart upload API you can upload objects up to 5 TB each. For more information, see http://docs.aws.amazon.com/AmazonS3/latest/dev/uploadobjusingmpu.html.
/// </summary>
/// <param name="id">Unique id for tracking the upload progress</param>
/// <param name="bucketName">The name of the bucket that the object is being uploaded to</param>
/// <param name="file">The file that will be uploaded</param>
/// <returns>The unique id</returns>
public string Upload(string id, HttpPostedFileBase file)
{
var reader = new BinaryReader(file.InputStream);
var data = reader.ReadBytes((int)file.InputStream.Length);
var stream = new MemoryStream(data);
var utility = new TransferUtility(client);
var request = new TransferUtilityUploadRequest()
{
BucketName = this.bucketName,
Key = file.FileName,
InputStream = stream
};
request.UploadProgressEvent += (sender, e) => request_UploadProgressEvent(sender, e, id);
utility.Upload(request);
return id;
}
private void request_UploadProgressEvent(object sender, UploadProgressArgs e, string id)
{
lock (syncRoot)
{
processStatus[id] = e.PercentDone;
}
}
public void Add(string id)
{
lock (syncRoot)
{
processStatus.Add(id, 0);
}
}
public void Remove(string id)
{
lock (syncRoot)
{
processStatus.Remove(id);
}
}
public int GetStatus(string id)
{
lock (syncRoot)
{
if (processStatus.Keys.Count(x => x == id) == 1)
{
return processStatus[id];
}
else
{
return 100;
}
}
}
public void Dispose()
{
this.repository.Dispose();
this.client.Dispose();
}
}
and my controller looks like this:
public class _UploadController : Controller
{
public void StartUpload(string id, HttpPostedFileBase file)
{
var bucketName = CompanyProvider.CurrentCompanyId();
using (var service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], bucketName))
{
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
}
public void CompleteUpload(IAsyncResult result)
{
var caller = (AsyncMethodCaller)result.AsyncState;
var id = caller.EndInvoke(result);
}
//
// GET: /_Upload/GetCurrentProgress
public JsonResult GetCurrentProgress(string id)
{
try
{
var bucketName = CompanyProvider.CurrentCompanyId();
this.ControllerContext.HttpContext.Response.AddHeader("cache-control", "no-cache");
using (var service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], bucketName))
{
return new JsonResult { Data = new { success = true, progress = service.GetStatus(id) } };
}
}
catch (Exception ex)
{
return new JsonResult { Data = new { success = false, error = ex.Message } };
}
}
}
Now, I have found that I sometimes get an ObjectDisposedException when trying to upload a file, on this line: var data = reader.ReadBytes((int)file.InputStream.Length);. I read that I should not use the using keyword because of the asynchronous calls, but the stream still seems to be getting disposed.
Can anyone tell me why?
Update 1
I have changed my controller to this:
private ObjectService service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], CompanyProvider.CurrentCompanyId());
public void StartUpload(string id, HttpPostedFileBase file)
{
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
public void CompleteUpload(IAsyncResult result)
{
var caller = (AsyncMethodCaller)result.AsyncState;
var id = caller.EndInvoke(result);
this.service.Dispose();
}
but I am still getting the error on the file.InputStream line.
Update 2
The problem seems to be with the BinaryReader.
I changed the code to look like this:
var inputStream = file.InputStream;
var i = inputStream.Length;
var n = (int)i;
using (var reader = new BinaryReader(inputStream))
{
var data = reader.ReadBytes(n);
var stream = new MemoryStream(data);
var request = new TransferUtilityUploadRequest()
{
BucketName = this.bucketName,
Key = file.FileName,
InputStream = stream
};
try
{
request.UploadProgressEvent += (sender, e) => request_UploadProgressEvent(sender, e, id);
utility.Upload(request);
}
catch
{
file.InputStream.Dispose(); // Close our stream
}
}
If the upload fails and I try to re-upload the item, that is when the error is thrown. It is like the item is locked or something.
You are disposing the service with the using statement while the call you started with BeginInvoke is still running.
using (var service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], bucketName))
{
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
You have to dispose your service when the job is done:
private ObjectService service = new ObjectService(ConfigurationManager.AppSettings["AWSAccessKey"], ConfigurationManager.AppSettings["AWSSecretKey"], CompanyProvider.CurrentCompanyId()); // the bucket name must be resolvable at field level
public void StartUpload(string id, HttpPostedFileBase file)
{
var bucketName = CompanyProvider.CurrentCompanyId();
service.Add(id);
var caller = new AsyncMethodCaller(service.Upload);
var result = caller.BeginInvoke(id, file, new AsyncCallback(CompleteUpload), caller);
}
public void CompleteUpload(IAsyncResult result)
{
var caller = (AsyncMethodCaller)result.AsyncState;
var id = caller.EndInvoke(result);
service.Dispose();
}
Also your file might be corrupted, try this code:
byte[] buffer = new byte[file.InputStream.Length];
file.InputStream.Seek(0, SeekOrigin.Begin);
file.InputStream.Read(buffer, 0, (int)file.InputStream.Length);
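The underlying issue is that file.InputStream belongs to the HTTP request, and ASP.NET tears the request down as soon as StartUpload returns, while the BeginInvoke call is still using the stream. One way around this, sketched here with a hypothetical UploadBuffered variant of the Upload method, is to copy the posted file into memory on the request thread and hand only the buffered bytes to the background work:
public delegate string BufferedUploadCaller(string id, string fileName, byte[] data);

public void StartUpload(string id, HttpPostedFileBase file)
{
    // Buffer the posted file while the request (and its InputStream) is still alive.
    byte[] data;
    using (var ms = new MemoryStream())
    {
        file.InputStream.CopyTo(ms);
        data = ms.ToArray();
    }

    service.Add(id);
    // UploadBuffered is assumed to wrap the bytes in a MemoryStream and pass it to TransferUtility,
    // so nothing tied to the request is touched after this method returns.
    var caller = new BufferedUploadCaller(service.UploadBuffered);
    caller.BeginInvoke(id, file.FileName, data, new AsyncCallback(CompleteUpload), caller);
}
CompleteUpload would then cast result.AsyncState to BufferedUploadCaller before calling EndInvoke.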

Read an Excel spreadsheet in memory

How can I read an Excel spreadsheet that was just posted to my server?
I searched around, but I only found examples that read an Excel spreadsheet from a file path, which is not my case.
I need something like that:
public ActionResult Import(HttpPostedFileBase file)
{
var excel = new ExcelQueryFactory(file); //using linq to excel
}
I was running into the same issue, but I didn't want to switch to a paid service, so this is what I did.
public class DataImportHelper : IDisposable
{
private readonly string _fileName;
private readonly string _tempFilePath;
public DataImportHelper(HttpPostedFileBase file, string tempFilePath)
{
_fileName = file.FileName;
_tempFilePath = Path.Combine(tempFilePath, _fileName);
(new FileInfo(_tempFilePath)).Directory.Create();
file.SaveAs(_tempFilePath);
}
public IQueryable<T> All<T>(string sheetName = "")
{
if (string.IsNullOrEmpty(sheetName))
{
sheetName = (typeof (T)).Name;
}
var excelSheet = new ExcelQueryFactory(_tempFilePath);
return from t in excelSheet.Worksheet<T>(sheetName)
select t;
}
public void Dispose()
{
File.Delete(_tempFilePath);
}
}
Here is a Test
[Fact]
public void AcceptsAMemoryStream()
{
MemoryFile file;
using (var f = File.OpenRead("SampleData.xlsx"))
{
file = new MemoryFile(f, "multipart/form-data", "SampleData.xlsx");
using (var importer = new DataImportHelper(file, "Temp/"))
{
var products = importer.All<Product>();
Assert.NotEmpty(products);
}
}
}
Here is MemoryFile.cs. This class is only used for testing; it is just an implementation of HttpPostedFileBase so you can test your controllers and my little helper. It was borrowed from another post.
public class MemoryFile : HttpPostedFileBase
{
Stream stream;
string contentType;
string fileName;
public MemoryFile(Stream stream, string contentType, string fileName)
{
this.stream = stream;
this.contentType = contentType;
this.fileName = fileName;
}
public override int ContentLength
{
get { return (int)stream.Length; }
}
public override string ContentType
{
get { return contentType; }
}
public override string FileName
{
get { return fileName; }
}
public override Stream InputStream
{
get { return stream; }
}
public override void SaveAs(string filename)
{
using (var file = File.Open(filename, FileMode.Create))
stream.CopyTo(file);
}
}
Unfortunately it's not possible to read a spreadsheet from a stream with LinqToExcel.
That's because it uses OLEDB to read the spreadsheets, and OLEDB can't read from a stream.
You can use the InputStream property of HttpPostedFileBase to read the excel spreadsheet in memory.
I use the ClosedXML NuGet package to read Excel content from a stream, which is available in your case. It has a constructor overload that takes a stream for the Excel file (i.e. the workbook).
imported namespaces at the top of the code file:
using ClosedXML.Excel;
Source code:
public ActionResult Import(HttpPostedFileBase file)
{
    // HttpPostedFileBase cannot be passed to ExcelQueryFactory directly, so the original line is commented out
    //var excel = new ExcelQueryFactory(file); //using linq to excel
    var stream = file.InputStream;
    if (stream.Length != 0)
    {
        // handle the stream here
        using (XLWorkbook excelWorkbook = new XLWorkbook(stream))
        {
            var name = excelWorkbook.Worksheet(1).Name;
            // do more things, whatever you like, as you now have a handle to the entire workbook
            var firstRow = excelWorkbook.Worksheet(1).Row(1);
        }
    }
    return View(); // return whichever ActionResult fits your flow
}
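If you also need the cell contents rather than just the worksheet name, ClosedXML can iterate the used rows of a sheet. A small sketch building on the same stream variable as above (column positions and types here are only an example; Skip requires using System.Linq):
using (XLWorkbook excelWorkbook = new XLWorkbook(stream))
{
    IXLWorksheet sheet = excelWorkbook.Worksheet(1);
    foreach (IXLRow row in sheet.RowsUsed().Skip(1)) // skip the header row
    {
        string name = row.Cell(1).GetString();
        decimal price = row.Cell(2).GetValue<decimal>();
        // map the values into your own model here
    }
}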
You need the Office Interop assemblies. Check the Excel Object Model documentation for reference.

Creating MTOM and Deserializing it

I have been using some code from MSDN to create an MTOM message.
One of the users on the forum pointed out that there is an error, and I cannot understand where the problem lies.
The file (JPEG) data gets corrupted after de-serialization. The complete code is listed below.
public class Post_7cb0ff86_5fe1_4266_afac_bcb91eaca5ec
{
[DataContract()]
public partial class TestAttachment
{
private byte[] fileField;
private string filenameField;
[DataMember()]
public byte[] File
{
get
{
return this.fileField;
}
set
{
this.fileField = value;
}
}
[DataMember()]
public string Filename
{
get
{
return this.filenameField;
}
set
{
this.filenameField = value;
}
}
}
public static void Test()
{
string Filename = "Image.jpg";
byte[] file = File.ReadAllBytes(Filename);
TestAttachment Attachment = new TestAttachment();
Attachment.Filename = Filename;
Attachment.File = file;
MemoryStream MTOMInMemory = new MemoryStream();
XmlDictionaryWriter TW = XmlDictionaryWriter.CreateMtomWriter(MTOMInMemory, Encoding.UTF8, Int32.MaxValue, "");
DataContractSerializer DCS = new DataContractSerializer(Attachment.GetType());
DCS.WriteObject(TW, Attachment);
TW.Flush();
Console.WriteLine(Encoding.UTF8.GetString(MTOMInMemory.ToArray()));
var v = DeserializeMTOMMessage(Encoding.UTF8.GetString(MTOMInMemory.ToArray()));
File.WriteAllBytes(v.Filename,v.File);
}
public static TestAttachment DeserializeMTOMMessage(string MTOMMessage)
{
try
{
MemoryStream MTOMMessageInMemory = new MemoryStream(UTF8Encoding.UTF8.GetBytes(MTOMMessage));
XmlDictionaryReader TR = XmlDictionaryReader.CreateMtomReader(MTOMMessageInMemory, Encoding.UTF8, XmlDictionaryReaderQuotas.Max);
DataContractSerializer DCS = new DataContractSerializer(typeof(TestAttachment));
return (TestAttachment)DCS.ReadObject(TR);
}
catch
{
return null;
}
}
}
I would be grateful if someone could help me pin down where the problem is. I am new to XOP/MTOM and find it hard to track down whether the error is in the serialization or the de-serialization.
Thank you
There's a bug in your code.
Change your method call
MTOMInMemory.Position = 0;
DeserializeMTOMMessage(Encoding.UTF8.GetString(MTOMInMemory.ToArray()));
to
DeserializeMTOMMessage(MTOMInMemory.ToArray())
and the implementation to
public static TestAttachment DeserializeMTOMMessage(byte[] MTOMMessage)
{
try
{
MemoryStream MTOMMessageInMemory = new MemoryStream(MTOMMessage);
XmlDictionaryReader TR = XmlDictionaryReader.CreateMtomReader(MTOMMessageInMemory, Encoding.UTF8, XmlDictionaryReaderQuotas.Max);
DataContractSerializer DCS = new DataContractSerializer(typeof(TestAttachment));
return (TestAttachment)DCS.ReadObject(TR);
}
catch
{
return null;
}
}
What you were doing was a double conversion from UTF-8 text to a byte array and back, which does not round-trip the original bytes (the binary attachment is not valid UTF-8), so the data handed to the reader was no longer the data the writer produced.
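Put together, the end of Test() then looks roughly like this (a sketch of the change described above; the raw bytes go to the deserializer, and the UTF-8 decoding is only used for the console dump):
DCS.WriteObject(TW, Attachment);
TW.Flush();

// For inspection only; the decoded text is not reused for deserialization.
Console.WriteLine(Encoding.UTF8.GetString(MTOMInMemory.ToArray()));

// Hand the raw bytes to the reader so the binary attachment survives unchanged.
var v = DeserializeMTOMMessage(MTOMInMemory.ToArray());
File.WriteAllBytes(v.Filename, v.File);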
